The government’s long-awaited Online Safety Bill is finally set to be introduced to Parliament today.
MPs are expected to approve the safety laws, which aim to protect children from harmful content such as pornography and limit people’s exposure to illegal content.
It will require social media platforms, search engines and other apps and websites that allow people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions.
Independent regulator Ofcom will have the power to fine companies that fail to comply with the laws up to 10% of their annual global turnover, to force them to improve their practices and to block non-compliant sites.
The government says executives whose companies fail to cooperate with Ofcom’s information requests could face prosecution or jail time within two months of the Bill becoming law, rather than after two years as previously drafted.
A raft of other new offences has also been added to the Bill, making in-scope companies’ senior managers criminally liable for destroying evidence, for failing to attend interviews with Ofcom or providing false information in them, and for obstructing the regulator when it enters company offices.
“The internet has transformed our lives for the better. It’s connected us and empowered us. But on the other side, tech firms haven’t been held to account when harm, abuse and criminal behaviour have run riot on their platforms. Instead they have been left to mark their own homework,” said Digital Secretary Nadine Dorries.
“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving. Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age.
“If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”
The government claims the Bill is balanced and proportionate with exemptions for low-risk tech and non-tech businesses with an online presence.
The government also claims the Bill strengthens people’s rights to express themselves freely online and ensures social media companies do not remove legal free speech, with users gaining the right to appeal if they feel a post has been taken down unfairly.
It will also require social media firms to protect journalism and democratic political debate on their platforms. News content will be completely exempt from any regulation under the Bill.
Social media platforms will also be required to tackle only those categories of ‘legal but harmful’ content, such as self-harm, harassment and eating disorders, that are set by the government and approved by Parliament.
Previously they would have had to consider whether additional content on their sites met the definition of legal but harmful material. This change removes any incentives or pressure for platforms to over-remove legal content or controversial comments and will clear up the grey area around what constitutes legal but harmful.
Ministers will also continue to consider how to ensure platforms do not remove content from recognised media outlets.
Changes to the original draft Bill published in May 2021 include bringing paid-for scam adverts on social media and search engines into scope; requiring all websites which publish or host pornography to put robust checks in place to ensure users are 18 years old or over; adding new measures to clamp down on anonymous trolls, giving people more control over who can contact them and what they see online; and criminalising ‘cyberflashing’ – the sending of obscene images to strangers online.
“Today marks an important step towards creating a safer life online for the UK’s children and adults. Our research shows the need for rules that protect users from serious harm, but which also value the great things about being online, including freedom of expression. We’re looking forward to starting the job,” said Dame Melanie Dawes, Ofcom chief executive.