Unlock the Editor’s Digest for free
Roula Khalaf, Editor of the FT, selects her favourite stories in this weekly newsletter.
Social media groups, search engines and messaging apps will from next week be required to remove illegal material quickly and introduce strict measures to reduce the risk of such content appearing, under new rules from the UK’s media watchdog.
On Monday, Ofcom will begin enforcing new rules designed to protect internet users from illegal content and harmful activity online. Regulators and lawmakers want additional powers to curb the spread of the kind of extreme misinformation that fuelled violent unrest last summer following the Southport stabbings.
Under the UK’s Online Safety Act (OSA), tech companies were required to complete a mandatory illegal content risk assessment by the end of this weekend, to understand how likely it is that users will encounter illegal content on their services.
For “user-to-user” messaging services, this includes assessing how their platforms could be used to commit or facilitate criminal offences.
The so-called priority illegal content covers 17 categories, ranging from terrorism, child sexual abuse and encouraging or assisting suicide to stalking, drugs offences and fraud.
Starting next week, Ofcom will begin assessing platforms’ compliance with the new illegal harms duties under the OSA, and can launch enforcement action where it finds failures to comply. The OSA was passed by parliament in 2023, but is being implemented in stages this year and next.
Sites and apps must now begin implementing safety measures to mitigate these risks. These include naming a senior executive accountable for compliance, improving content moderation, making it easier for users to report problems and building in safety testing.
Under the new rules, tech companies must ensure that their moderation teams are properly resourced and trained, set performance targets and remove illegal material quickly. Platforms also need to test their algorithms to make illegal content harder to spread.
Ofcom will first prioritise large sites and apps that may present a particular risk of harm from illegal content, whether because of their large number of UK users or because their scale and nature raise the risk of users encountering some of the most harmful forms of online content and conduct.
Suzanne Cater, director of enforcement at Ofcom, said: “Providers who have failed to implement the necessary protections can expect to face the full force of our enforcement action.”
British law firm Linklaters described the new rules as “the first major compliance deadline” under the OSA. Companies can be fined up to £18mn or 10 per cent of qualifying worldwide revenue, whichever is greater.
Companies will be at very different stages of readiness, said Ben Packer, a partner at Linklaters: “I think there may be some companies that are not doing much at all.”
Packer added that the threat of Ofcom’s intervention could outweigh the financial penalties, noting that the regulator could “mandate additional steps” on content moderation, user reporting or the technology deployed to detect content.
“Our experience from other sectors is that such interventions in companies’ operations tend to have the greater impact,” he said.