The Online Safety Act: Protecting Children Without Compromising Privacy
The UK’s Online Safety Act (OSA), taking full effect from July 2025, is one of the most far-reaching pieces of online regulation the country has ever passed. Its goal is clear: to make the UK one of the safest places in the world to use the internet, especially for children. Platforms will be legally responsible for limiting exposure to harmful content, enforcing age restrictions, and removing illegal material more quickly.
But safety is only half the story. For privacy professionals, the Act raises serious questions about how these safety controls will be implemented. It introduces mechanisms that could intrude on personal privacy, especially for children. That creates a tension that must be resolved carefully. We should not be forced to choose between protecting children and handling their data responsibly.
This is not a theoretical discussion. Our organisation has been shortlisted for the 2025 Privacy Awards Europe for our work on safeguarding children’s data. That recognition reflects a growing demand for solutions that respect both safety and privacy. The Online Safety Act puts that demand to the test.
A double-edged sword for privacy
One of the most debated aspects of the Online Safety Act is its provision allowing Ofcom, under certain conditions, to require platforms to deploy accredited technology capable of detecting child sexual abuse material in private messages, including on end-to-end encrypted services. The intention is not to weaken encryption across the board, but to address a specific and serious harm. Most people would agree on the importance of tackling child sexual abuse online; the challenge lies in how that goal is pursued.
The Act does not mandate routine monitoring of private communications. However, the possibility of compelled scanning - even in limited cases - raises concerns about the long-term implications for privacy and secure communication. If detection mechanisms are built into messaging services, that risks setting a precedent in which encrypted platforms are expected to inspect content before it reaches the recipient, undermining the very purpose of end-to-end encryption.

This concern is particularly acute for children. While the intent is to protect them, automated systems designed to identify harmful content can overreach, leading to unnecessary data collection, profiling, or false positives. These unintended consequences can erode children's digital dignity and autonomy at a time when they are learning how to engage with the world online.
Age assurance: practical, but privacy-invasive?
Another key part of the Act is the requirement for platforms to assess whether children are accessing potentially inappropriate content. To comply, companies are turning to age assurance and verification systems. These may range from simple self-declaration to biometric checks, ID uploads, and behavioural analysis.
Without strong safeguards, these systems can become privacy liabilities. If age verification requires uploading a passport or submitting a face scan, that data must be handled with extreme care. Too often, the emphasis is on compliance rather than minimisation. Platforms collect more than they need, store it longer than necessary, and fail to explain clearly what happens to it.
The Act offers little help here. It requires platforms to take “proportionate” steps but leaves the interpretation of that term open. This makes it even more important for privacy professionals to lead in defining standards for transparency, necessity, and security.
Safeguarding children’s data: where privacy and protection meet
Our recent work, recognised by the Privacy Awards Europe, shows that it is possible to design systems that respect both privacy and safety. We have supported schools by embedding privacy practices directly into their safeguarding culture. As shown in our education case study, we provide hands-on DPO support, scenario-based training, and expert help with data sharing, breaches and regulator contact.
This approach avoids the trap of template compliance. Instead, it builds confidence across staff, parents and leadership. Privacy becomes part of daily safeguarding, not an afterthought. That’s the balance we believe the Online Safety Act should be aiming for.
What privacy professionals should do now
With the Act taking full effect in July 2025, there are practical steps organisations should take now:
- Review your data flows, especially for services likely to be accessed by children. Understand where sensitive data might be collected in the name of safety.
- Conduct or update DPIAs with a specific focus on children. Use the ICO’s Children’s Code as a guide and look at Ofcom’s risk profiles for further context.
- Explore privacy-preserving age assurance tools. These include third-party solutions that verify age without exposing personal details to the platform itself (see the sketch after this list).
- Engage with Ofcom’s upcoming codes and consultations. The regulator has said it will work closely with the ICO, so there is a strong case to influence guidance before it becomes fixed.
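To make the data-minimisation point concrete, here is a minimal sketch of how a platform might accept a signed age attestation from a third-party provider without ever handling identity documents itself. The token format, the shared key and the verify_age_token helper are illustrative assumptions for this article, not a description of any particular product, provider or standard; real deployments would typically rely on established asymmetric-signature or federated-identity schemes.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret agreed with a third-party age-assurance provider.
# Real schemes would more likely use asymmetric signatures, but an HMAC keeps
# this illustration self-contained.
PROVIDER_KEY = b"example-shared-secret"


def verify_age_token(token: str) -> bool:
    """Return True if the token attests 'over 18' without revealing identity.

    Assumed token format: '<urlsafe-base64 payload>.<hex HMAC signature>',
    where the payload carries only an age claim and an expiry timestamp,
    e.g. {"over_18": true, "exp": 1767225600}. No name, date of birth or
    document image ever reaches the platform.
    """
    try:
        payload_b64, signature = token.split(".")

        # Check the provider's signature over the payload.
        expected = hmac.new(
            PROVIDER_KEY, payload_b64.encode(), hashlib.sha256
        ).hexdigest()
        if not hmac.compare_digest(expected, signature):
            return False

        # Decode the minimal claim set (re-add base64 padding if stripped).
        padded = payload_b64 + "=" * (-len(payload_b64) % 4)
        claims = json.loads(base64.urlsafe_b64decode(padded))
    except (ValueError, json.JSONDecodeError):
        return False

    # Reject expired attestations; accept only the single claim we need.
    if claims.get("exp", 0) < time.time():
        return False
    return claims.get("over_18") is True
```

The point of the design is that the platform stores nothing beyond a yes/no outcome: the provider holds the evidence, the platform holds the answer, and neither needs to retain more than its role requires.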
Conclusion: child safety should never come at the cost of dignity
The Online Safety Act is a bold piece of legislation. It addresses very real risks to children online, and it sets a higher bar for accountability. But good intentions do not guarantee good outcomes. If implemented carelessly, the tools meant to protect children could instead erode their right to privacy.
Our job is to make sure that does not happen. As professionals who understand both the technical and human side of data protection, we must help shape how this law is applied in practice. That means offering alternatives, asking better questions, and never losing sight of the fact that children are not just users or data points. They are individuals who deserve safety and privacy in equal measure.