In February 2026, the UK Information Commissioner’s Office imposed a £14.47 million fine on Reddit for unlawfully processing the personal data of children who accessed the platform. The penalty follows a pattern of escalating ICO enforcement against major platforms — TikTok (£12.7 million), Snap (£1.95 million), and now Reddit — for failing to prevent children from accessing their services and being subject to data profiling.
The Reddit case sharpens a question that every platform serving a mixed-age audience must now address: what does adequate age assurance look like, and is self-declaration — asking users to confirm they are above a minimum age — sufficient?
The ICO’s answer in the Reddit decision is no.
What Reddit Did and What the ICO Found
Reddit is among the most visited websites globally, hosting discussion communities across virtually every topic. Reddit’s terms of service require users to be at least 13 years old to create an account. The platform’s only age assurance mechanism was self-declaration: users confirmed their own age at account creation.
The ICO’s investigation found that Reddit’s age self-declaration model was inadequate to prevent children — including children under 13 — from accessing the platform, creating accounts, and being subject to Reddit’s data processing practices, including the profiling used to serve personalized content and advertising.
The ICO’s core findings:
Self-declaration is not age assurance. A user who inputs a false date of birth to clear a minimum age gate has not provided verified age information — they have provided self-reported information that Reddit treated as verified. The ICO’s position, consistent across its enforcement actions in this area, is that “age assurance” requires a mechanism that provides reasonable confidence that the user is actually the age claimed, not merely that they have claimed an age.
Children were being profiled without lawful basis. Reddit’s data processing for personalized content and advertising is based on profiling — building a picture of user preferences and interests over time. Under UK GDPR and the Children’s Code (the ICO’s Age Appropriate Design Code), profiling children requires a lawful basis that self-declaration cannot establish, because the consent of a child under 13 has no legal effect and the consent of a child between 13 and 17 is subject to heightened standards.
The Children’s Code creates specific obligations for likely child users. The ICO’s Age Appropriate Design Code, operative since September 2021, requires that platforms likely to be accessed by children apply specific data protection standards by default — regardless of whether the platform is designed for children or specifically targeted at them. Reddit, as a platform with content categories likely to attract under-18 users, was within scope.
The ICO’s Enforcement Pattern on Children’s Data
The Reddit fine is the latest in a consistent ICO enforcement pattern. Understanding the series of cases helps clarify what the enforcement standard actually requires.
TikTok — £12.7 million (2023). The ICO fined TikTok for multiple violations, including failing to obtain valid parental consent for children under 13, processing children’s data without a lawful basis, and using children’s data in ways that were not transparent. The TikTok case established that platforms with significant child user populations face direct enforcement exposure even when children gain access despite the platform’s terms of service.
Snap — £1.95 million (2025). The ICO fined Snap for failing to properly assess the impact of a product feature (the Snap Map) on the privacy of child users. The Snap case extended the Children’s Code to product feature decisions — not just data collection practices — and established that a Data Protection Impact Assessment that failed to adequately consider child safety impacts did not satisfy the Code’s requirements.
Reddit — £14.47 million (2026). The Reddit decision extends the enforcement pattern in two ways: first, by applying substantial fines to a platform that is not primarily a children’s service but that children use; and second, by making explicit that age self-declaration does not satisfy the age assurance requirement.
The escalating fine amounts reflect the ICO’s assessment of organizational culpability and the scale of the processing involved. Reddit’s global user base — and the volume of data processed — warranted a penalty in the same order of magnitude as the TikTok fine.
What the Children’s Code Requires
The ICO’s Age Appropriate Design Code (the Children’s Code) sets out 15 standards that apply to online services “likely to be accessed by children.” The relevant standards for the Reddit case:
Age appropriate application. Services must apply the Code’s protections to all users who are children or who the service cannot verify are not children. This means: if your platform cannot verify user age, you must treat all users as potentially being children for compliance purposes — or implement adequate age assurance to verify they are not.
Default settings. The highest privacy settings must be on by default for child users. Profiling, behavioral advertising, and content personalization that uses personal data must be off by default unless a child or their parent explicitly enables it.
Profiling and data minimization. Profiling of children is prohibited unless there is a clear reason why it is in the best interests of the child and accompanied by appropriate safeguards.
Transparency and geolocation. Clear and age-appropriate privacy information must be provided. Precise geolocation services must be off by default.
What “Adequate Age Assurance” Now Requires
The Reddit decision, combined with the broader ICO enforcement record, has effectively set a minimum standard for age assurance on platforms with mixed or undetermined user age profiles.
Age self-declaration — a date of birth field that users can fill in with any value — does not satisfy the standard.
What the ICO’s guidance and enforcement record indicate is required:
For platforms designed for or heavily used by children: Robust age verification — matching declared age against independent databases, document verification, or similar mechanisms — is effectively required.
For general audience platforms with content likely to attract under-18 users: At minimum, age estimation technology (AI-based inference from behavioral signals, device characteristics, or contextual data) or other assurance mechanisms that provide reasonable confidence in user age, combined with default-off data processing for users the platform cannot verify as adults.
For all platforms: A documented assessment of whether the platform is “likely to be accessed by children” — and if yes, a documented plan for how the Children’s Code standards apply and are implemented.
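The tiered standard above reduces to a simple decision rule: self-declaration never establishes adult status, so a platform must either obtain stronger assurance or treat the user as potentially a child. The sketch below illustrates that logic only — the assurance-level names are hypothetical, not the ICO’s own taxonomy:

```python
from enum import Enum

class AgeAssurance(Enum):
    """Hypothetical assurance levels, ordered by confidence in user age."""
    SELF_DECLARED = 1   # user-entered date of birth only
    ESTIMATED = 2       # AI-based age estimation or similar signals
    VERIFIED = 3        # document or database verification

def apply_child_protections(assurance: AgeAssurance, verified_adult: bool) -> bool:
    """Return True if Children's Code defaults should apply to this user.

    Per the enforcement pattern described above: self-declaration alone
    never establishes adult status, so self-declared users are always
    treated as potentially children.
    """
    if assurance == AgeAssurance.SELF_DECLARED:
        return True  # self-declaration is not age assurance
    return not verified_adult  # protections stay on unless assured adult

# A self-declared "adult" still receives child-safe defaults:
print(apply_child_protections(AgeAssurance.SELF_DECLARED, verified_adult=True))   # True
print(apply_child_protections(AgeAssurance.VERIFIED, verified_adult=True))        # False
```

The key property is that the weakest assurance tier short-circuits to protections-on regardless of what the user claimed.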
The UK Online Safety Act, now in force, adds a parallel layer: platforms must conduct and document Children’s Access Assessments that evaluate whether children are likely to access the service, and must implement the Code’s protections if the assessment so indicates. Ofcom began rolling out updated guidance on Children’s Access Assessments in May 2026.
The UK Data (Use and Access) Act: New Enforcement Powers
The Reddit fine was issued under existing UK GDPR enforcement authority. But the regulatory landscape changed further on February 5, 2026, when the Data (Use and Access) Act 2025 (DUAA) commenced.
The DUAA expanded the ICO’s enforcement powers in several areas relevant to children’s data:
Raised PECR fine ceiling to UK GDPR levels. The Privacy and Electronic Communications Regulations, which govern cookie consent and electronic marketing, previously had a much lower fine ceiling than UK GDPR. The DUAA aligned PECR fines with UK GDPR’s maximum (£17.5 million or 4% of global annual turnover). Platforms that relied on PECR’s lower penalties as a de facto compliance ceiling have seen that ceiling disappear.

New ICO investigatory powers. The DUAA gives the ICO expanded authority to require documents, interview personnel, and conduct technical inspections.
The combined effect: platforms operating in the UK now face a more powerful regulator with more tools and higher fine ceilings across a wider range of data protection and privacy obligations.
Implications for Platforms Outside the UK
The ICO’s enforcement authority applies to processing of UK residents’ personal data regardless of where the platform is incorporated. Reddit is a U.S. company; the ICO fine reaches it because it processes personal data of UK users.
In the EU, Article 8 GDPR sets equivalent standards for children’s consent, with each member state able to set its digital age of consent between 13 and 16, and national DPAs enforce them. Germany, Ireland, and other EU member states have their own children’s code equivalents or enforcement priorities in this area.
In the United States, COPPA (the Children’s Online Privacy Protection Act) governs data collection from children under 13, with FTC enforcement authority. The FTC’s recent actions — including the 2026 statement by the Chairman that protecting children online is “one of the most important consumer protection issues of our time” — signal parallel enforcement attention.
The practical implication for U.S. platforms serving or likely to serve UK and EU users: the ICO’s Children’s Code requirements are operative, the enforcement is real, and self-declaration age gating does not provide adequate protection against regulatory action.
What Platform Operators Must Do
The Reddit enforcement action, read alongside the TikTok and Snap cases, creates a clear compliance mandate for platforms accessible to children:
Conduct a Children’s Access Assessment. Formally evaluate whether your platform is likely to be accessed by children. Document the assessment. If the answer is yes or uncertain, apply the Children’s Code protections.
Implement age assurance that goes beyond self-declaration. Options include age estimation technology, verification against identity databases, device-level parental controls integration, or other mechanisms that provide reasonable confidence in user age. Self-declaration alone will not satisfy the ICO’s standard.
Default all data processing to off for unverified users. Profiling, behavioral advertising, and personalized content based on data processing should be off by default for users the platform cannot verify as adults.
Audit your Data Protection Impact Assessment for child-specific risks. If your DPIA does not specifically address the risk of child users accessing your service and the data processing implications, it does not satisfy the Children’s Code requirements as interpreted in the Snap and Reddit enforcement actions.
Review cookie and advertising technology deployment for PECR compliance. With PECR fines now aligned with UK GDPR maximums, cookie compliance failures that were previously low-risk financial exposures are now material enforcement risks.
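The default-settings step in the list above can be sketched as a settings profile applied to every user the platform cannot verify as an adult. This is an illustration under assumed, hypothetical field names, not a real platform’s configuration schema:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PrivacySettings:
    # Hypothetical field names, for illustration only.
    # Children's Code standards: all data-driven features off by default.
    profiling: bool = False
    behavioral_ads: bool = False
    personalized_content: bool = False
    precise_geolocation: bool = False

def settings_for(verified_adult: bool, wants_personalization: bool = False) -> PrivacySettings:
    """Only a verified adult's explicit choice can enable personalization."""
    if verified_adult and wants_personalization:
        return replace(PrivacySettings(), profiling=True,
                       behavioral_ads=True, personalized_content=True)
    # Unverified users keep child-safe defaults; their opt-in is ignored.
    return PrivacySettings()

# An unverified user who "opts in" still gets everything off:
print(settings_for(verified_adult=False, wants_personalization=True))
```

The design point is that the privacy-protective state is the constructor default, so any code path that forgets to branch on verification status fails safe.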
The Reddit fine is not the end of ICO enforcement on children’s data — it is another data point in a consistent and escalating enforcement series. The ICO has demonstrated across multiple cases that it will pursue major platforms, impose substantial fines, and require operational changes. The standard it is enforcing — age assurance through mechanisms more robust than self-declaration — is now clearly established.
Platforms that have not yet addressed age assurance on their UK and EU properties are operating with known enforcement exposure. The question is when, not whether.
Sources: ICO Enforcement Action (Reddit, February 2026); Osborne Clarke (UK ICO fines online platform £14.47m); ICO Enforcement Page; Bratby Law (DUAA ICO Enforcement 2026); ICO Children’s Code; UK Online Safety Act 2023; Ofcom Children’s Access Assessment Guidance (May 2026); ICO TikTok enforcement decision (2023); ICO Snap enforcement decision (2025). This article is provided for informational purposes only and does not constitute legal advice.



