2025 was the year of online safety laws – but do they work?
The way young people experience the internet is changing
Linda Raymond/Getty Images
Several countries around the world introduced new restrictions on internet access in 2025 to protect children from viewing harmful content, and others seem intent on following suit in 2026. But do these measures really protect children, or simply inconvenience adults?
The UK’s Online Safety Act (OSA) came into force on 25 July, requiring websites to block children from seeing pornography and content that encourages self-harm, depicts violence or promotes dangerous stunts. The legislation has attracted criticism over the wide range of “harmful content” it covers, and it prompted a flurry of small websites to shut down as their owners saw no way to comply with the heavy regulatory burden it brought.
Meanwhile, Australia is bringing in a ban on social media for those aged under 16, even if their parents approve of its use. The Online Safety Amendment (Social Media Minimum Age) Act 2024 took effect this month and gave regulators the power to fine companies up to AU$50 million if they fail to stop children using their platforms. The European Union is debating a similar ban on access for children, and France introduced laws requiring age verification for websites containing pornographic content, sparking protests from companies that operate adult websites.
There are certainly signs that such legislation has teeth. UK regulator Ofcom has fined AVS Group, which runs 18 porn websites, £1 million for failing to take adequate steps to prevent access by children, while other companies have been “told to do more work” on safety measures. But it is on technology that these new laws fall down.
Facial-recognition technology designed to check ages can be fooled by using screenshots of video game characters, and VPNs make it trivial to appear to websites as a user from another country where age checks aren’t mandated. Worryingly for legislators, web searches for VPNs soared in the hours following the OSA coming into force and companies reported daily sign-ups increasing by up to 1800 per cent. So news that the largest porn website saw a 77 per cent drop in visits from the UK in the wake of the OSA should perhaps be taken with a pinch of salt – users may just be changing their settings to appear as if they are coming from countries where age checks aren’t necessary.
The Children’s Commissioner for England has said that this loophole needs closing and has suggested age verification to stop children using VPNs. But that smacks of chasing the problem in circles rather than clamping down on it at source – so what should we be doing?
Andrew Kaung, who previously worked in the safety and moderation teams at both Meta and TikTok, says he doesn’t believe harmful content is shown to children deliberately, but rather inadvertently, because algorithms learn that it holds attention longer and drives more engagement, thereby generating more advertising revenue. This makes him sceptical that technology companies will really strive to protect children, as doing so is likely to harm their bottom line.
“It’s very hard to imagine that they’re going to enforce [any new legislation] themselves when their interest and the public interest is kind of against each other. Profit is still king,” says Kaung. “They will do the bare minimum in terms of compliance.”
Graham Murdock at Loughborough University, UK, says that regulation will always lag behind the fast pace of technology companies, so the flurry of new online safety laws is likely to disappoint. Instead, he would like to see the creation of state-run internet services, with search engines and social media platforms operated on a public charter along the lines of the BBC.
“The internet is a public service. It offers all sorts of incredibly valuable capacities for people in their everyday life, so we have to think of it like a public utility,” says Murdock. “I think we’re at a sort of hinge point. If we don’t do something fairly serious now, then I think it will be beyond retrieval.”