The Office of Communications (Ofcom) will be given additional powers to regulate social media firms in the UK regarding harmful content.
The UK government has decided to grant new powers to Ofcom, which currently regulates broadcast and other media but has no responsibility for internet safety, including social media content. Until now, social media firms such as Facebook, TikTok, YouTube, Snapchat and Twitter have been largely self-regulating.
While these companies have defended their own rules about taking down unacceptable content, critics have called for independent rules to keep people safe.
There has been widespread clamor for more action on harmful content, particularly after the death of teenager Molly Russell, who took her own life after viewing graphic content on Instagram.
The government will officially announce the new powers for Ofcom late on Wednesday, but it remains unclear what penalties Ofcom will be able to enforce.
The additional powers will enable Ofcom to hold tech firms responsible for protecting people from harmful content such as violence, terrorism, cyber-bullying and child abuse, and to ensure that platforms remove such content quickly.
Digital Secretary Baroness Nicky Morgan said: “There are many platforms who ideally would not have wanted regulation, but I think that’s changing. I think they understand now that actually regulation is coming.”
The decision to grant Ofcom additional powers is the government’s first response to the Online Harms consultation it carried out in 2019, which received 2,500 replies. The new Ofcom rules will apply to firms hosting user-generated content, including comments, forums and video-sharing.
While the government will set the direction of the policy, Ofcom will have the freedom to write and adapt the details. This will enable the regulator to address new online threats as they emerge without the need for further legislation.