Tech companies must ‘tame’ aggressive algorithms under Ofcom’s online safety rules

Social media platforms must take action to prevent their algorithms from recommending harmful content to children, and put in place robust age verification measures to protect them, Ofcom says.

The regulator has published its draft codes of practice for child safety, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.

Online safety laws require sites accessed by children to take action to protect those younger users by assessing the risk their platform poses to children and then taking steps to limit those risks – with steep fines as potential punishment for those that are found to be in violation.

As the new regulator for the sector, Ofcom has published a set of draft codes in recent months, setting out how platforms should deal with different types of content, before the new rules come into full effect, which is expected at the end of this year.

The latest codes include more than forty practical measures that Ofcom says will require a step-change from technology companies, enforcing safer design and operating practices at the largest sites.

In particular, the codes expect services to implement robust age verification processes to prevent children from accessing harmful material, and to ensure that their recommendation algorithms – such as ‘For You’ pages – do not serve dangerous or potentially harmful content to children.

According to the proposals, platforms that children can access and that carry a higher risk of harmful content appearing should configure their algorithms to filter the most harmful content out of children’s feeds, and to reduce the visibility and prominence of other material that is lower risk but still potentially harmful.

The draft codes also require companies to have content moderation systems and processes in place, and to ensure swift action is taken against harmful content, with search engines expected to have a “safe search” option for use by children.

Ofcom chief executive Dame Melanie Dawes said: “We want children to enjoy life online. For too long, their experiences have been ruined by seriously harmful content that they cannot avoid or control. Many parents share feelings of frustration and concern about how to protect their children. That has to change.

“In line with new online safety laws, our proposed codes put the responsibility for keeping children safer squarely on technology companies. They will have to tame aggressive algorithms that push harmful content to children in their personalized feeds and implement age controls so that children get an age-appropriate experience.

“Our measures, which go far beyond current industry standards, will bring about a major change in the online safety of children in Britain. Once they are in place, we will not hesitate to use our full range of enforcement powers to hold platforms to account. That is a promise we make to children and parents today.”

Sir Peter Wanless, chief executive of children’s charity NSPCC, said the draft code was a “welcome step in the right direction” towards protecting children online.

“Building on the ambition in the Online Safety Act, the draft codes set appropriate, high standards and make clear that all technology companies will have work to do to meet Ofcom’s expectations to keep children safe,” he said.

“Tech companies will be legally required to ensure their platforms are fundamentally safe for children when the final code comes into effect, and we urge them to lead the way now and take immediate action to prevent inappropriate and malicious content from being spread or shared with children and young people.

“Importantly, this draft code shows that both the Online Safety Act and effective regulation play a crucial role in ensuring children can safely access and explore the online world.

“We look forward to participating in Ofcom’s consultation and will share our expertise in safeguarding and child safety to ensure the voices and experiences of children and young people are central to decision-making and the final version of the code.”

Children’s online safety campaigner Ian Russell, father of 14-year-old Molly Russell who killed herself in November 2017 after seeing harmful material on social media, said more needs to be done to protect young people from online harm.

In his role as chairman of the Molly Rose Foundation, an online safety charity, Mr Russell said: “Ofcom’s job was to seize the moment and propose bold and decisive action that can protect children from widespread but inherently avoidable harm.

“The regulator has proposed a number of important and welcome measures, but the overall set of proposals must be more ambitious to prevent children from being exposed to the kind of harmful content that cost Molly her life.

“It’s been over six years since Molly’s death, but the reality is that very little has changed. In some ways, the risks for teens have actually gotten worse.

“That’s why it’s hugely important that the next Prime Minister commits to getting the job done and strengthening the Online Safety Act to give children and families the protection they deserve.”

Technology Secretary Michelle Donelan said: “When we passed the Online Safety Act last year, we went further than almost any other country in our bid to make Britain the safest place to be online as a child.

“That task is a complex journey, but one we are committed to, and our groundbreaking laws will hold technology companies accountable in a way they have never experienced before.

“The government commissioned Ofcom to implement the law and today the regulator has been clear: platforms must introduce the kind of age checks that young people experience in the real world and tackle algorithms that too readily mean they encounter harmful material online.

“Once these measures are in place, they will fundamentally change the way children in Britain experience the online world.

“I want to reassure parents that protecting children is our first priority and that these laws will help keep their families safe.

“My message for platforms is: communicate with us and prepare. Don’t wait for enforcement and high fines, but take your responsibility and act now.”
