In an open letter published on 12 March 2026, the data protection regulator, the Information Commissioner’s Office (ICO), called on online platforms to strengthen age assurance measures so that children cannot easily access services that are not designed for their age group. The letter follows the ICO’s recent enforcement action against US discussion forum Reddit, and is part of a broader shift towards closer scrutiny of how platforms identify and protect child users.
Key takeaways from the ICO’s open letter
- Self-declared age verification methods are no longer adequate where services are likely to be accessed by children.
- Platforms should use modern, viable age assurance technologies to enforce their own minimum age requirements and reduce the risk of circumvention.
Background to the ICO’s letter
The letter follows the ICO’s £14.7 million fine issued to Reddit in February 2026 for failures surrounding the protection of children’s privacy, including a lack of robust age assurance and risk assessments.
Many online platforms make clear that their services are not intended for under-18s, but in practice do not implement meaningful controls capable of ensuring that underage users cannot access those services. The ICO’s guidance, set out in its open letter, reflects a widening gap between policy (terms of use) and reality (actual access), and an expectation from the regulator that organisations close that gap with effective design and governance.
The ICO has made it clear that it will use its enforcement powers where platforms fail to implement effective age assurance and, as a result, process children’s personal information unlawfully and potentially expose children to inappropriate or harmful content.
The ICO’s expectations
The open letter states that platforms with a minimum age requirement must not rely on children to self-declare their age, because this approach is too easy to bypass. Instead, the ICO expects online services to use viable, modern age assurance technologies that are now readily available, and to demonstrate that their approach is effective.
For many organisations, this will mean taking a more deliberate approach to product design by, for example, clarifying their intended audience, identifying where children are likely to access their services, and selecting proportionate controls to enforce minimum age thresholds.
The ICO has already written to major platforms like TikTok, Instagram, and Facebook, asking them to demonstrate compliance with robust age-assurance mechanisms and children’s data protection requirements. Where protection is not sufficient, the ICO will take enforcement action, including investigations and imposing financial penalties where appropriate.
Beyond preventing access by children below a platform’s minimum age, the ICO emphasises the need to safeguard children who are old enough to use a service. This includes ensuring that settings, features and data practices are designed with children’s best interests in mind, and that protections are applied where a user is identified as a child.
How this fits into the ICO’s Children’s Code Strategy
The Children’s Code (also known as the Age Appropriate Design Code) is a statutory code of practice issued by the ICO under the Data Protection Act 2018. It sets out how online services likely to be accessed by children under 18 should design their services to protect children’s personal data and comply with the UK GDPR.
The ICO says its open letter forms part of the next phase of its Children’s Code strategy: shifting towards stronger scrutiny and enforcement of age assurance. In practice, that means moving platforms away from self-declared ages and towards age-checking mechanisms that are robust, proportionate and demonstrable.
The ICO’s aim is that platforms should be able to identify which users are children, so those users can benefit from the protections they are entitled to: for example, high privacy settings by default, data minimisation, and limits on profiling where appropriate.
The open letter has also highlighted ongoing concerns about how social media and video-sharing platforms process children’s personal data to produce recommendations, particularly where algorithms promote harmful content. Age assurance is relevant here because platforms need to know when child-specific safeguards should apply.
Coordinated regulation from the ICO and Ofcom
In order to fulfil the aim of increasing the protection of children online, the ICO continues to work closely with Ofcom, which enforces the Online Safety Act 2023. On 25 March 2026, both regulators published a joint statement on age assurance, setting out how organisations can navigate the interaction between online safety obligations and data protection requirements when deploying age assurance.
What should organisations do now?
- Map your user base and child-access risks: identify whether children are likely to access the service in practice (even if not intended), and where the highest-risk journeys sit (sign-up, content access, messaging, recommendations).
- Review minimum age rules and enforcement: if you set a minimum age, ensure you have an effective age gate and anti-circumvention controls. Tick-boxes and date-of-birth entry alone are unlikely to meet expectations.
- Select proportionate age assurance: consider which technical measures are viable for your context and risk profile, and ensure they are privacy-friendly and data-minimising.
- Check lawful basis and children’s personal data requirements: where under-13s may access your services, assess whether you have a lawful basis for the processing of the personal data concerned and what additional safeguards are required.
- Complete a Data Protection Impact Assessment (DPIA): document potential risks to children and the corresponding mitigations, and ensure your DPIA reflects the reality of how the service is accessed.
- Be transparent: provide clear, accessible privacy information explaining what age assurance you use, what personal data is processed, retention periods, and how users can exercise their rights under data protection legislation.
- Scrutinise recommender systems: assess how children’s personal data is used for profiling and recommendations, and whether this could lead to harmful content exposure or excessive engagement patterns.
- Build evidence and governance: be ready to demonstrate decisions, testing, monitoring and continuous improvement to regulators.