
How the Online Safety Act unfolds: child protection, privacy trade‑offs and VPN loopholes.

Privacy Culture | August 6, 2025

Since 25 July 2025, services that publish or allow users to encounter pornographic content have been required to implement highly effective age assurance to prevent children from accessing it. This covers commercial pornography providers as well as other regulated services that host user-generated content. Age verification is no longer optional: platforms must deploy checks such as facial age estimation, ID uploads, or credit card verification. Noncompliant services may face fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater), as well as court-ordered blocking.

Where context enters the picture:

Not every platform hosting user-generated content is subject to age verification, but confusion over the scope of the new rules has led some services to go beyond what’s legally required. Faced with uncertainty or wary of regulatory scrutiny, some smaller communities have reportedly considered blocking UK access due to the compliance burden associated with age assurance systems, echoing similar trends seen under GDPR and earlier UK proposals.

At the same time, a few larger platforms have introduced age checks for relatively low-risk content, even where the legal basis is unclear. While the Act imposes a broad set of duties, including content moderation, risk assessments, and transparency obligations, age verification is specifically required in limited circumstances, such as access to pornographic content or certain other services that are likely to be accessed by children. But in practice, the fear of misinterpreting the law appears to be prompting a cautious, sometimes excessive, approach across the digital landscape.

When companies collect more data than necessary:

Although the Act introduces duties around age assurance, it does not prescribe detailed requirements for how age verification providers must handle personal data. Privacy critics report that unnecessary biometric collection continues even for low-risk services, turning verification into a data-rich surveillance channel for platforms with weak oversight. The lack of explicit data-minimisation requirements means raw ID documents and scans may be stored when they are not truly needed (Electronic Frontier Foundation).

How evasion undermines enforcement:

Since the rollout of age verification rules, VPN usage in the UK has surged. According to WIRED, downloads of major VPN apps have increased dramatically as users seek to bypass geographic restrictions and avoid age checks altogether. While VPNs are not illegal, PC Gamer reports that Ofcom has warned platforms against promoting their use to circumvent compliance measures. Verification systems have also shown vulnerabilities: some users reportedly passed face-scan checks using game screenshots or AI-generated faces (The Verge). And facial age estimation tools are known to misclassify people from minority groups more often, meaning accuracy errors may disproportionately block certain users (Raconteur).

What happens in practice:

  • Around 5 million additional age verifications are now happening every day in the UK for adult sites (The Guardian).
  • But public doubt is high. While nearly 70% support the intent of the law, only about 24% believe it will truly prevent under‑18s from accessing restricted content (YouGov).
  • Concerns are also growing about unintended consequences. Many community forums, including mental health or sexual trauma support groups, are now blocked for non-verified users, potentially deterring help-seeking youth (The Verge).

What could be done differently in the actual rollout:

These are not ideals but practical shifts that could happen, and in part already are happening, within the Act’s framework:

  1. Age checks could apply strictly to high-risk services, not general-purpose sites. Some platforms may already limit verification to sections tagged as adult or harmful, rather than to entire services.
  2. Token-based credentials may be adopted: rather than storing biometric data, a user verifies once and receives a token to access multiple compliant services—thus minimising exposure.
  3. Regulate verification providers: Age assurance firms, especially those operating outside the UK, could be subject to auditing requirements, including rules on data deletion, encryption standards, and limits on retention. Stronger oversight would help build trust in the ecosystem and ensure compliance with data protection principles.
  4. Low-risk or community services could offer anonymous or optional flows, preserving open access while fulfilling obligations for platforms handling priority content.
  5. Feedback loops may detect high evasion behaviour, prompting regulators to adjust enforcement focus or encourage simpler, less invasive verification tools.
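The token-based approach in item 2 can be sketched roughly as follows. This is a hypothetical, simplified flow with invented names (`issue_age_token`, `verify_age_token`) and a single shared HMAC secret for illustration; a real deployment would use per-service asymmetric signatures in a standard format such as a signed JWT, issued by an audited age-assurance provider:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret between the age-assurance provider and the
# relying platform. A real deployment would use asymmetric signatures
# (e.g. a signed JWT verified with the provider's public key).
SECRET = b"demo-only-secret"

def issue_age_token(over_18: bool, ttl_seconds: int = 3600) -> str:
    """Issued once after a successful age check. The token carries only
    the yes/no claim and an expiry time -- no ID document, no biometrics."""
    payload = json.dumps(
        {"over_18": over_18, "exp": int(time.time()) + ttl_seconds}
    ).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_age_token(token: str) -> bool:
    """Any compliant service can check the claim without ever seeing
    the data used in the original verification."""
    try:
        encoded, sig = token.split(".")
        payload = base64.urlsafe_b64decode(encoded)
    except ValueError:  # malformed token (binascii.Error subclasses ValueError)
        return False
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # signature mismatch: token was tampered with
    claims = json.loads(payload)
    return bool(claims["over_18"]) and claims["exp"] > time.time()
```

The point of the design is data minimisation: the service that gates adult content learns only a signed yes/no answer with an expiry, never the face scan or ID document behind it.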

Conclusion: Navigating a complex rollout

The Online Safety Act marks a significant shift in the UK’s approach to regulating digital spaces, with the aim of creating a safer online environment for children. But its early implementation reveals the tension between ambition and practicality. While the introduction of age assurance for adult content is a notable first step, the boundaries of what is required remain blurred. This has led some platforms to adopt more extensive age checks than legally necessary, raising questions about proportionality, user access, and data handling.

At the same time, persistent workarounds like VPNs and spoofing, as well as concerns around privacy and inclusion, highlight the challenge of enforcement in a dynamic and decentralised online ecosystem. The Act has opened the door to stronger protections, but its long-term success will depend on how effectively regulators and platforms can balance safety with privacy, access with accountability, and legal clarity with technical feasibility.

The tools the law now enables could protect children, but without tight scope, strong data governance, and contextual discretion, the current rollout risks undermining both effectiveness and privacy.
