Ofcom has said that children in the UK will have safer online lives after its ‘transformational new protections’ were finalised today.
The communications regulator is laying down more than 40 practical measures for tech firms to meet their duties under the Online Safety Act.
These will apply to sites and apps used by UK children in areas such as social media, search and gaming, and follow consultation and research involving tens of thousands of children, parents, companies and experts.
The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography.
“These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content,” said Dame Melanie Dawes, Ofcom chief executive.
“Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
The Codes of Practice demand a ‘safety-first’ approach in how tech firms design and operate their services in the UK.
The measures include safer feeds, effective age checks, fast action, strong governance, easier reporting and complaints, and more choice and support for children.
Natalia Greene, online safety expert at PA Consulting, added: “Ofcom’s ‘Protection of Children Codes of Practice’ marks a significant step forward towards a safer internet for our children and young people.
“This will help to protect children from material that promotes suicide, self-harm, misogyny and eating disorders, as well as other abusive, violent and hateful content.
“In the weeks and months ahead, young people should see their social media ‘feeds’ transformed as more harmful content is filtered out by improved ‘recommender’ systems.
“Behind the scenes, changes to how services are governed should improve overall accountability for child safety. Platforms will be required to name a senior individual accountable for compliance and establish a monitoring function to independently assure the effectiveness of safety measures.
“Every service provider in regulatory scope now has until 24 July to assess the risks posed to child users of their service.”
If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.
The regulator added that it will build on the new codes in the coming months with additional measures to protect users from illegal material and from harms to children.