UK urges social platforms not to recommend harmful content to children

UK orders social media sites to impose tougher age checks

The UK’s Office of Communications (Ofcom) has urged social media platforms to do more to stop their algorithms from recommending harmful content to children.

According to the BBC, Ofcom has warned that social media sites could be banned for users under 18 if the platforms fail to comply with the new online safety rules.

Ofcom has published a draft children’s safety code of practice that requires social media firms to adopt tougher age-checking measures.

According to the regulator, the codes contain more than 40 ‘practical measures.’

Meanwhile, spokespeople for Snapchat and Meta said the firms already have extra protections in place for users under 18, and that their apps also offer parental tools.

A Snapchat spokesperson said, “As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience. We support the aims of the Online Safety Act and work with experts to inform our approach to safety on Snapchat.”

A Meta spokesperson said, “Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules, and we remove that content when we find it.”

Speaking to the BBC, Ofcom head Dame Melanie Dawes described the new rules as a ‘big moment.’

She noted, “Young people are fed harmful content on their feed again and again, and this has become normalized, but it needs to change.”

Dawes added, “We will be publishing league tables so that the public knows which companies are implementing the changes and which ones are not.”