This week, Ofcom will publish codes of practice designed to prevent children from accessing adult content on platforms such as X and Meta, rules that have emerged as a potential flashpoint in the UK's trade negotiations with the US.
The UK media regulator will tell social media, search and gaming services that they must remove adult content such as pornography, place it behind an "age gate", or find other ways to protect children from certain "legal but harmful" content.
The rules, which stem from the Online Safety Act passed in 2023, are being introduced in phases and amount to one of the biggest reforms to how Britons access social media, including mainstream platforms such as Instagram, X and Facebook.
Melanie Dawes, chief executive of Ofcom, told the Financial Times last year that the regime represented "a major change" in how the industry operates.
In practice, the codes mean sweeping changes to how algorithms serve content: platforms must either remove adult content entirely or introduce tough new age checks to stop under-18s from accessing sites and apps that host it.
Social media sites may need to use strict age-verification tools for the first time, such as those requiring credit card details, or employ facial age-estimation technology.
Tech groups could also prevent children from viewing adult material in other ways, such as offering "clean" areas of their services, or even removing pornography from social media sites altogether rather than restricting it to over-18s.
Alongside pornography, under-18s should no longer encounter content promoting suicide, self-harm or eating disorders, and should be protected from misogynistic, violent, hateful or abusive material, Ofcom says.
The codes set out practical steps platforms can take to meet their duties, such as configuring algorithms to filter harmful content out of children's social media feeds and search results.
By last week, tech companies had to carry out so-called children's access assessments to establish whether their services, or parts of them, are likely to be accessed by children. Facebook, Instagram, Snap, X and TikTok all allow users as young as 13.
By the end of July, tech groups must complete individual assessments of the risks their services pose to children and begin applying measures to mitigate those risks. Companies that breach the act face fines of up to £18mn or 10 per cent of global revenue.
Ofcom will begin further consultations on additional measures, including the use of artificial intelligence to tackle illegal content, and hash matching to prevent the sharing of non-consensual intimate images and terrorist content.
Hash matching, or hash scanning, compares content such as videos, photos and text against a database of known illegal material.
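In its simplest form, the technique works by computing a digital fingerprint of an uploaded file and checking it against a list of fingerprints of known prohibited material. The sketch below illustrates the idea with an ordinary cryptographic hash and a hypothetical, illustrative database; production systems typically use perceptual hashes (such as Microsoft's PhotoDNA) so that resized or re-encoded copies still match, which a plain cryptographic hash cannot do.

```python
import hashlib

# Hypothetical database of hashes of known prohibited files.
# (This entry is the SHA-256 digest of the bytes b"foo", used
# purely for illustration.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_illegal(data: bytes) -> bool:
    """Flag an upload if its fingerprint appears in the database."""
    return file_hash(data) in KNOWN_HASHES

print(is_known_illegal(b"foo"))   # True: exact match with the database
print(is_known_illegal(b"safe"))  # False: no match
```

Because an exact-match hash changes completely if even one byte of the file changes, real deployments pair this lookup with perceptual hashing to catch near-duplicates.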
The watchdog will also propose crisis-response protocols for emergencies such as last summer's riots.
Some duties under the online safety regime are already in force, such as requirements for social media companies, search engines and messaging apps to quickly delete illegal material and reduce the risk of such content appearing.
However, online safety campaigners are concerned that the law could be watered down as part of the UK's trade negotiations with the US, given the new duties it imposes on US-based social media sites.
US officials asked about the law at a meeting with Ofcom last month, and Vice-President JD Vance raised concerns about free speech restrictions affecting American tech companies when British Prime Minister Keir Starmer visited the White House in February.
"We can't imagine a scenario in which Keir Starmer's government would give up child safety for a trade deal, because doing so would not serve it," said Baroness Beeban Kidron, a crossbench peer in the House of Lords and digital rights campaigner.
Snap said it "is continuing to work with Ofcom on implementation in support of the goals of the Online Safety Act".
Meta said that all UK teenagers using its platforms, including Instagram and Facebook, have been moved to new "teen accounts" to comply with the regulations, although those aged 17 and 18 can override the restrictions.
X said it "is taking all necessary steps to ensure compliance with UK law", while TikTok said it also complies with the rules.