Digital childhoods: Why age matters in data protection

Privacy Culture | July 22, 2025

Most companies say they don’t handle children’s data. Then someone checks the user base. There’s a birthday field, a free app used by schools, a product that quietly became popular with teenagers. It doesn’t take much. Before long, you're processing data from under-18s without meaning to.

And when you are, everything changes.

Age isn’t a checkbox; it’s a compliance trigger

Under UK GDPR, children merit special protection. The ICO’s Children’s Code sets out 15 specific standards. It focuses on how services are designed, how data is handled, and what kind of profiling is permitted. But the reality is, most organisations don’t realise they’re within scope.

The test is simple. Is it likely that children use your service? Do you market to them? If the answer is yes, even unintentionally, you’re expected to design for their rights.

The most common mistake we see in our Privacy Operations Centre is the assumption that a birthday field at sign-up is enough. Users enter 1993 and move on. There’s no check, no logic that changes based on age, no way to handle false inputs.

That creates risk: if your system treats 13-year-olds and 33-year-olds the same, you’re not protecting anyone.
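
As a rough illustration of what "logic that changes based on age" can mean, the difference is often a single branch at sign-up: compute an age from the date of birth, reject implausible values, and flag under-18 accounts so the rest of the system can treat them differently. The sketch below is a minimal example; the field names and the 18-year threshold are illustrative, not a reference implementation.

```typescript
// Minimal sketch: make the birthday field do something, instead of
// accepting "1993" and moving on. Names are illustrative only.

function ageInYears(dateOfBirthIso: string, now: Date = new Date()): number {
  const dob = new Date(dateOfBirthIso);
  if (Number.isNaN(dob.getTime()) || dob > now || now.getFullYear() - dob.getFullYear() > 120) {
    throw new Error("Implausible date of birth: ask the user to re-enter it");
  }
  let age = now.getFullYear() - dob.getFullYear();
  // Subtract a year if the birthday hasn't happened yet this year.
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function isChildAccount(dateOfBirthIso: string): boolean {
  return ageInYears(dateOfBirthIso) < 18;
}
```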

Profiling, defaults, and real-world exposure

Many platforms profile users by default. Recommendation engines, engagement prompts, even marketing nudges. When children are part of your user base, these systems need to behave differently. The Children’s Code is clear: no profiling without a compelling reason and extra safeguards.

The same goes for geolocation, push notifications, and data retention. If your default settings are built around adult behaviour, you’re missing the mark.
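
In practice, "behaving differently" usually means a second set of defaults keyed off the child flag above. A rough sketch follows; the specific switches and retention period are placeholders for illustration, not recommendations for any particular product.

```typescript
// Hypothetical default profiles: the fields and values are invented
// for this sketch, not taken from any real system.

interface PrivacyDefaults {
  profilingEnabled: boolean;        // recommendations, engagement prompts, marketing nudges
  geolocationEnabled: boolean;
  pushNotificationsEnabled: boolean;
  retentionDays: number;
}

const ADULT_DEFAULTS: PrivacyDefaults = {
  profilingEnabled: true,
  geolocationEnabled: true,
  pushNotificationsEnabled: true,
  retentionDays: 365,
};

const CHILD_DEFAULTS: PrivacyDefaults = {
  profilingEnabled: false,          // no profiling without a compelling reason and safeguards
  geolocationEnabled: false,        // off by default
  pushNotificationsEnabled: false,
  retentionDays: 90,                // keep less, for less time
};

function defaultsFor(isChild: boolean): PrivacyDefaults {
  return isChild ? CHILD_DEFAULTS : ADULT_DEFAULTS;
}
```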

We’ve worked with organisations where child users made up over 30% of the user base - and no one in product or UX had ever seen the risk flagged. That’s not bad intent. It’s just what happens when legal owns the policy, but product owns the flow.

This is where we often support clients by creating internal playbooks. Short, team-specific guides that outline how to recognise child data, what to do when it’s found, and how to flag design changes early. It’s not about adding friction. It’s about baking responsibility into the process.

How to shift culture without slowing down

You don’t need to overhaul your platform to protect children’s data. But you do need better signals inside the business. Start by reviewing your onboarding flow. Can someone under 18 realistically sign up? If yes, what happens next?

Then map your user base. You might not collect age directly, but analytics and usage patterns will give you clues. Once you know children are part of your audience, create a plan.
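
You won’t get certainty from indirect signals, but even a crude heuristic can tell you whether a closer review is worth commissioning. A hypothetical sketch, with the signal names and thresholds invented for illustration and standing in for whatever your own analytics pipeline can surface:

```typescript
// Hypothetical usage signals: none of these fields come from a real
// analytics product; thresholds are arbitrary examples.

interface UsageSignals {
  selfReportedUnder18Share: number;   // share of accounts with a DOB under 18 (0..1)
  schoolDomainReferralShare: number;  // share of sessions referred from school/education domains (0..1)
  listedInEducationCategory: boolean; // e.g. the app appears in an education app-store category
}

// Returns true when the signals suggest children are likely part of the audience.
function likelyAccessedByChildren(s: UsageSignals): boolean {
  return (
    s.selfReportedUnder18Share > 0.02 ||
    s.schoolDomainReferralShare > 0.05 ||
    s.listedInEducationCategory
  );
}
```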

In our Privacy Operations Centre, we’ve seen success when companies adopt a “trigger model”: product launches, design updates, or new features automatically prompt a privacy check. This works far better than relying on one-off legal reviews.
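
The mechanics can be very light. One way to picture a trigger model (purely illustrative, not a tool we ship) is a short checklist that runs whenever a change is proposed and raises a privacy-review task when any trigger fires:

```typescript
// Illustrative trigger model: the trigger list and proposal shape are
// invented for this sketch, not taken from any client's process.

interface ChangeProposal {
  title: string;
  collectsNewPersonalData: boolean;
  changesDefaultSettings: boolean;
  addsProfilingOrRecommendations: boolean;
  touchesChildFacingSurface: boolean;
}

function privacyTriggers(change: ChangeProposal): string[] {
  const triggers: string[] = [];
  if (change.collectsNewPersonalData) triggers.push("new personal data collected");
  if (change.changesDefaultSettings) triggers.push("default settings changed");
  if (change.addsProfilingOrRecommendations) triggers.push("profiling or recommendations added");
  if (change.touchesChildFacingSurface) triggers.push("child-facing surface affected");
  return triggers;
}

function needsPrivacyReview(change: ChangeProposal): boolean {
  return privacyTriggers(change).length > 0;
}
```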

Training needs to extend beyond compliance teams. Your developers, designers, and customer service staff should all know when age matters and what it changes. It’s not about burdening everyone with legal knowledge. It’s about giving people confidence to ask the right questions.

Protecting children’s data isn’t just regulatory hygiene. It’s public trust. And it’s often where the next reputational risk lies. You can’t spot every edge case, but you can build a system where people know what to watch for.