Overview of the Enforcement Action
On March 10, 2026, Spain’s Agencia Española de Protección de Datos (AEPD) published a landmark enforcement resolution imposing a total fine of €950,000 on Yoti Ltd, a UK-based digital identity and age verification company. The resolution, filed under reference EXP202317887 and signed by AEPD President Lorenzo Cotino Hueso, identifies three distinct violations of the EU General Data Protection Regulation (GDPR) in the operation of Yoti’s Digital ID application.
The investigation was initiated by the AEPD’s Deputy Directorate for Data Inspection in December 2023 after the authority identified practices that “could constitute a possible infringement” of data protection law. It represents one of the most detailed public rulings to date on the specific obligations of age verification providers operating within the European Union.
Breakdown of the Fine
| Violation | GDPR Article | Fine Amount |
|---|---|---|
| Unlawful processing of biometric data | Article 9 (Special Category Data) | €500,000 |
| Invalid consent mechanisms | Article 7 (Conditions for Consent) | €200,000 |
| Excessive data retention | Article 5(1)(e) (Storage Limitation) | €250,000 |
| Total |  | €950,000 |
In addition to the financial penalties, the AEPD has ordered Yoti to implement corrective measures within six months of the resolution becoming final, requiring the company to demonstrate compliance across all three areas of violation.
What Is Yoti and How Does Its Digital ID App Work?
Yoti Ltd, registered in the United Kingdom (tax ID 08998951), is a digital identity company that provides age assurance and identity verification services to platform operators, businesses, and government agencies across multiple markets. The company’s most recent published revenue figure, cited in the AEPD resolution as of March 2025, is €15,029,907 cumulatively since inception, a figure the authority used as a reference point for calibrating a proportionate penalty.
Yoti’s app has been downloaded by over 14 million users globally, and the company offers a range of age verification methods, including:
- Facial age estimation from a selfie (using deep neural networks)
- Government-issued ID document scanning and verification
- Reusable digital ID credentials
- Credit card verification
- Mobile number matching
- Database checks against third-party records
- Electronic identity (eID) services used in Switzerland, Denmark, and Finland
- US mobile driving licence verification
The Digital ID App: Core of the Enforcement Action
The Digital ID app is the specific service targeted in the AEPD’s enforcement action. According to Yoti’s own documentation submitted during the investigation, the app works as follows:
- Account creation: Users upload a government-issued identity document and take a selfie
- Biometric processing: The app uses deep neural networks to process the facial image, converting it into numerical pixel values and passing them through layers of mathematical nodes loosely analogous to neurons in the human brain
- Template creation: A biometric template is generated from the facial scan and stored while the account remains active
- Ongoing verification: When users modify their PIN or recover their account, the app captures a new facial scan and compares it against the stored template in a 1:1 matching operation
- Age estimation: The typical processing pipeline produces an estimated age in approximately 1 to 1.5 seconds
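The 1:1 matching step described above can be sketched as a similarity comparison between a live sample and a stored template. This is a minimal illustrative sketch only; the embeddings, the cosine-similarity metric, and the threshold are assumptions, not Yoti’s actual implementation:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(live: list[float], stored: list[float],
                 threshold: float = 0.85) -> bool:
    """1:1 match: one live sample against one stored template.

    No search across other users takes place, yet the comparison still
    confirms that this face belongs to this account holder -- the
    "unique identification" criterion the AEPD applied under Art. 4(14).
    """
    return cosine_similarity(live, stored) >= threshold

# Hypothetical low-dimensional embeddings; real templates are
# high-dimensional outputs of a face-recognition model.
stored = [0.12, 0.80, 0.55, 0.07]
live_same = [0.11, 0.82, 0.53, 0.09]
live_other = [0.90, 0.10, 0.05, 0.40]

print(authenticate(live_same, stored))    # similar vectors: match
print(authenticate(live_other, stored))   # dissimilar vectors: no match
```

The point of the sketch is that nothing in the code searches a database of other users, yet it is still a biometric comparison that confirms identity, which is exactly the distinction the AEPD declined to treat as legally significant.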
The facial age estimation model was trained using 12 age range categories (0–1, 2–3, 4–6, 7–9, 10–12, 13–15, 16–17, 18–24, 25–29, 30–39, 40–49, 50–60), four gender groupings, and three skin tone groups based on the Fitzpatrick scale — producing 144 demographic combinations. According to Yoti’s white paper, the model achieves accuracy within 1.28 years.
For age token reuse, Yoti developed a cookie-based system where age tokens (browser cookies valid for 30 days) allow users who verify once to reuse that result across participating platforms.
Violation 1: Unlawful Processing of Biometric Data (Article 9 GDPR) — €500,000
The AEPD’s primary and largest finding concerns the processing of biometric data without a valid legal basis under Article 9 of the GDPR, which governs the handling of “special category” personal data.
The Legal Framework
Article 9(1) of the GDPR establishes a general prohibition on the processing of special category data, which explicitly includes “biometric data for the purpose of uniquely identifying a natural person.” Processing is only permitted where one of the specific exemptions listed in Article 9(2) applies — such as explicit consent (Article 9(2)(a)), substantial public interest (Article 9(2)(g)), or where processing is necessary for reasons of preventive or occupational medicine (Article 9(2)(h)).
Article 4(14) of the GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person.”
Yoti’s Argument: Authentication, Not Identification
Throughout the investigation, Yoti maintained that the facial scan it performs does not constitute biometric data of a special category because its purpose is not to uniquely identify users but merely to authenticate them. The company drew a distinction between identification (determining who someone is from a pool of individuals) and authentication (confirming that a person is who they claim to be through a 1:1 match).
The AEPD’s Rejection
The AEPD firmly rejected this argument. The authority found that data constitutes biometric special category data under Article 4(14) when three criteria are satisfied:
- It relates to the physical, physiological, or behavioural characteristics of a natural person
- It is intended to confirm unique identification
- It has been subjected to specific technical processing to generate, store, exploit, and destroy biometric templates derived from raw samples
The AEPD found all three criteria satisfied in Yoti’s case. The facial scan produces a biometric template that is stored for the duration of the account. When users perform account recovery or PIN changes, the app captures a new facial scan and runs a 1:1 comparison against the stored template — which the authority characterised as a unique identification operation.
The AEPD noted that “despite repeatedly asserting — both during account creation and in the privacy policy — that the purpose of processing the biometric facial pattern is to guarantee user identification, Yoti does not consider itself to be processing special category personal data,” a position the authority characterised as reflecting “particular negligence.”
Aggravating Factors
The AEPD identified several aggravating factors that increased the severity of this violation:
- Involvement of minors: An age verification application would, by its nature, be used extensively by users under 18 — a population requiring heightened data protection
- International data transfers: Biometric data is processed on servers outside the European Union. Yoti operates a Security Centre in India that provides manual verification support, relying on EU standard contractual clauses with a UK addendum for lawful transfers
- Scale of processing: The app has been downloaded millions of times and processes biometric data across multiple jurisdictions
Implications of This Finding
This ruling establishes an important precedent: 1:1 biometric matching (authentication) is treated identically to 1:N biometric identification for the purposes of Article 9 protection. Organizations that use selfie-matching for account verification, onboarding, or identity confirmation should take note — the “it’s just authentication” argument does not provide an exemption from special category data requirements.
Violation 2: Invalid Consent Mechanisms (Article 7 GDPR) — €200,000
The second violation concerns the conditions under which Yoti obtained consent from its users, found to be non-compliant with Article 7 of the GDPR.
What Article 7 Requires
Article 7 establishes the conditions for valid consent under the GDPR. When processing relies on consent as its legal basis, the controller must be able to demonstrate that consent was:
- Freely given: The data subject must have a genuine choice
- Specific: Consent must be granular and tied to particular purposes
- Informed: The data subject must understand what they are consenting to
- Unambiguous: Consent must be given through a clear affirmative action
Article 7(2) further requires that consent requests be presented “in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language.”
Two Critical Design Failures
The AEPD identified two specific design issues that rendered Yoti’s consent mechanisms invalid:
1. Click-Through Privacy Policy: The Yoti app allowed users to click past the privacy policy screen without ever opening or reading the policy document. Users could proceed through the account creation flow by simply tapping a button, with no mechanism ensuring they had actually accessed, let alone understood, the information about how their data would be processed.
2. Default Consent for R&D: The app defaulted to consenting users’ biometric data for use in research and development activities. Rather than requiring an affirmative opt-in for this secondary purpose, the consent was bundled with the primary service consent by default. Users would need to actively seek out and disable this option — a practice that fails the GDPR’s requirement for specific, granular consent.
The Age Declaration Problem
The AEPD also scrutinised Yoti’s age-gating mechanism. The app imposed age-based controls by jurisdiction — in Spain, it presented users with two options: “I am 14 or over” or “I am 13 or under,” only allowing the account creation process to continue if the first option was selected. Critically, no mechanism existed to verify whether the user’s self-declaration was accurate. The biometric data of minors as young as 13 could therefore be processed with no differentiated safeguards, despite the heightened protections the GDPR affords to children’s data under Recital 38.
Violation 3: Excessive Data Retention (Article 5(1)(e) GDPR) — €250,000
The third violation addresses the GDPR’s storage limitation principle, which requires that personal data be “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed.”
Yoti’s Retention Practices
The AEPD found several of Yoti’s data retention practices to be disproportionate and in violation of Article 5(1)(e):
Geolocation data — 5 years: Yoti collected location data to determine which jurisdiction’s age restrictions applied to the user. This geolocation data was retained for five years in the company’s systems. The AEPD found this retention period grossly disproportionate to the stated purpose — determining applicable age thresholds is a one-time operation that requires no long-term storage of precise location records.
Biometric templates — duration of account + 3 years: According to Yoti’s own Data Protection Impact Assessment (DPIA), age tokens and Digital ID app data were retained for as long as the user maintained an active account, or for three years following the last activity. The biometric template captured during the initial selfie verification was stored throughout this entire period. The AEPD found this disproportionate because the biometric template was stored for potential future account recovery — a contingency that might never materialise.
Liveness detection recordings — 30 days: Video recordings captured during liveness detection (used to verify that a real person, not a photograph, was presenting to the camera) were retained for 30 days.
Fraudulent ID documents — indefinite: Identity documents identified as fraudulent during failed verification attempts were retained beyond their original purpose and repurposed to train Yoti’s verification algorithms. The AEPD objected to this secondary use: a document submitted as evidence of identity becomes a permanent training asset without the submitter’s consent for that purpose.
Why Five Years of Location Data Is Excessive
The five-year retention of geolocation data stands out as particularly egregious. Consider the purpose: Yoti collects location data to determine whether a user is in Spain (where the age of digital consent is 14) versus the UK (where it is 13). This is a binary determination required at the moment of account creation. There is no legitimate purpose for retaining precise location records for five years afterward.
Five years of location data can reveal intimate details about a person’s life — where they live, work, travel, worship, seek medical care, and with whom they associate. For an age verification service, maintaining such a detailed location history represents a significant privacy risk with no proportionate benefit.
Under the GDPR’s data minimisation principle (Article 5(1)(c)), controllers should collect only data that is adequate, relevant, and limited to what is necessary. Under the storage limitation principle (Article 5(1)(e)), they should retain it no longer than required. Yoti’s five-year retention of location data fails on both counts.
Yoti’s Response and Appeal
Yoti has responded forcefully to the AEPD’s decision, issuing a public statement rejecting the ruling:
“Yoti is extremely disappointed to confirm that we have recently been sanctioned by the Spanish Data Protection Agency (AEPD) for infringements of data protection law relating to the Yoti Digital ID app. Yoti rejects in the strongest possible terms the decision of the AEPD and has begun the appeal process to the Spanish High Court.”
The company has emphasised several key points in its defence:
- No data breach occurred: “No personal data of any app user has been breached or compromised in any way — the Yoti Digital ID app remains secure”
- Scope is limited: “The findings relate to the Digital ID app only” — not to Yoti’s broader age estimation or verification services provided to business clients
- Procedural concerns: “We fully cooperated with the AEPD’s information requests, but we were never notified that we were under investigation”
The procedural complaint is notable — Yoti claims it was unaware it was the subject of a formal investigation despite cooperating with information requests. Whether this procedural objection gains traction on appeal remains to be seen.
Yoti has up to one month to formally appeal the decision to the Spanish High Court. However, the six-month compliance deadline runs independently of the appeal process, meaning Yoti must demonstrate corrective measures regardless of whether its legal challenge succeeds.
Context: The AEPD’s Growing Appetite for Biometric Enforcement
The Yoti fine does not exist in isolation. Spain’s data protection authority has shown an increasing willingness to enforce GDPR requirements against biometric data processing.
The Aena Airport Precedent
In November 2025, the AEPD imposed a €10 million fine on Aena, Spain’s national airport operator (and the world’s largest by passenger volume), for GDPR violations related to its biometric boarding system deployed across eight airports. The authority ordered Aena to suspend the facial recognition boarding programme entirely. Like Yoti, Aena has appealed the decision, characterising the fine as “disproportionate.”
Together, the Aena and Yoti cases signal that the AEPD views biometric data processing as a priority enforcement area and is willing to impose significant penalties even on companies and entities that argue their use of biometrics is proportionate and consumer-friendly.
Broader European Biometric Enforcement
The Yoti case fits within a broader pattern of European regulatory action on biometric data:
| Entity | Regulator | Fine | Year | Issue |
|---|---|---|---|---|
| Clearview AI | Dutch DPA (AP) | €30.5 million | 2024 | Scraping facial images without consent for biometric database |
| Clearview AI | French CNIL | €20 million | 2022 | Illegal collection and use of biometric data |
| Clearview AI | Italian Garante | €20 million | 2022 | Unlawful facial recognition database |
| Clearview AI | Greek DPA | €20 million | 2022 | GDPR violations in facial recognition |
| Aena | Spanish AEPD | €10 million | 2025 | Biometric boarding without adequate safeguards |
| Yoti | Spanish AEPD | €950,000 | 2026 | Biometric data in digital ID app |
While the Yoti fine is smaller in absolute terms than the Clearview AI penalties, the case is arguably more significant for the industry because Yoti’s business model — age verification and digital identity — is precisely the type of service that governments are increasingly mandating.
The Tension: Government Mandates vs. Privacy Enforcement
The Yoti enforcement exposes a fundamental tension in European digital policy. On one side, EU and national governments are accelerating mandates for online age verification:
- The EU’s eIDAS 2.0 Regulation requires member states to offer citizens a European Digital Identity Wallet by 2026, establishing standardised digital identity infrastructure across the bloc
- The European Commission has developed a blueprint for age verification currently being piloted in Denmark, France, Greece, Italy, and Spain
- Spain’s own online safety legislation is among the most comprehensive in Europe for age checks on digital platforms
- Australia has passed laws barring children under 16 from social media entirely
- Over half of US states now mandate or are considering mandating age verification for adult content or social media platforms
On the other side, data protection authorities are enforcing strict requirements on the very infrastructure needed to deliver these mandates. Companies like Yoti exist specifically to service government-mandated age verification requirements, yet find themselves fined for the technical methods necessary to provide those services.
This creates a challenging compliance environment. Age verification inherently requires some form of identity processing. The most reliable methods — biometric verification, document scanning, facial age estimation — all involve processing special category data or sensitive personal information. The AEPD’s ruling makes clear that providing these services requires navigating extremely stringent GDPR requirements, even when the services exist to fulfil legal obligations imposed by other arms of the same government.
Lessons for Organizations Using Biometric Verification
The Yoti case offers critical compliance lessons for any organization processing biometric data, particularly in the digital identity and age verification space.
1. Biometric Authentication ≠ Exemption from Article 9
The AEPD’s ruling establishes that 1:1 biometric matching (authentication) triggers Article 9 protections just as 1:N biometric identification does. Organizations cannot avoid special category data requirements by characterising their biometric processing as “authentication” rather than “identification.” If you are comparing a live biometric sample against a stored template to confirm identity, you are processing special category data.
Action item: Review all biometric processing operations. If any involve comparing biometric data against stored templates — regardless of whether the comparison is 1:1 or 1:N — ensure you have a valid legal basis under Article 9(2).
2. Consent Must Be Genuinely Informed and Granular
The €200,000 consent violation demonstrates that regulators will scrutinise the design of consent flows, not just their existence. A privacy policy that users can click past without reading does not constitute informed consent. Default opt-ins for secondary purposes (like R&D) do not constitute specific, granular consent.
Action items:
- Ensure users must actively access and acknowledge privacy information before consenting
- Separate consent for primary service delivery from consent for secondary purposes (analytics, R&D, algorithm training)
- Implement affirmative opt-in mechanisms rather than pre-checked boxes or default settings
- Document consent flows and maintain evidence that consent was freely given, specific, informed, and unambiguous
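The action items above can be sketched as a per-purpose consent record. The field names and structure are hypothetical, but they illustrate the two principles the AEPD enforced: every purpose defaults to "no consent" until the user affirmatively opts in, and evidence is kept that the privacy notice was actually accessed:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One recorded consent decision per purpose (evidence for Art. 7(1)).

    Every purpose defaults to False: the user must take an affirmative
    action to opt in, and secondary purposes such as R&D are never
    bundled with consent to the primary service.
    """
    user_id: str
    policy_version: str        # which privacy notice the user was shown
    policy_opened: bool        # evidence the notice was actually accessed
    service_delivery: bool = False           # primary purpose
    research_and_development: bool = False   # secondary purpose, separate opt-in
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def may_use_for_rnd(rec: ConsentRecord) -> bool:
    # R&D use requires an informed user AND its own affirmative opt-in.
    return rec.policy_opened and rec.research_and_development

rec = ConsentRecord(user_id="u-123", policy_version="2026-03",
                    policy_opened=True, service_delivery=True)
print(may_use_for_rnd(rec))   # False: no separate R&D opt-in was recorded
```

Contrast this with the design the AEPD sanctioned: there, the R&D flag would effectively have defaulted to `True`, and `policy_opened` was never checked at all.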
3. Data Retention Must Be Purpose-Limited and Proportionate
The €250,000 retention violation reinforces that organizations must justify every category of data they retain and every retention period they set. The key question is always: is this data still necessary for the specific purpose for which it was collected?
Action items:
- Conduct a data retention audit across all personal data categories
- Map each data category to its specific processing purpose and define the minimum retention period required
- Implement automated deletion schedules aligned with purpose-limited retention periods
- Eliminate retention of data “just in case” — storing biometric templates for hypothetical future account recovery is not proportionate
- Pay particular attention to location data, which can reveal deeply personal information and should be retained for the absolute minimum period necessary
4. Purpose Limitation Applies to Secondary Uses
Retaining fraudulent identity documents to train verification algorithms is a secondary purpose that requires its own legal basis. Data collected for one purpose cannot be repurposed for another without additional justification under Article 6 (and Article 9 for special category data).
Action item: If you retain data from failed or fraudulent verification attempts for algorithm training, ensure you have a separate legal basis for this processing and have informed data subjects of this secondary purpose.
5. Children’s Data Requires Heightened Protection
Age verification services will inevitably process data belonging to minors. The GDPR’s Recital 38 states that children “merit specific protection with regard to their personal data.” Self-declaration age gates (where a child can simply claim to be 14 or older) provide no meaningful protection.
Action items:
- Implement age-appropriate design principles
- Consider whether additional safeguards are needed when processing minors’ biometric data
- Evaluate whether parental consent mechanisms (required under Article 8 for children below the age of digital consent) are adequate
6. DPIAs Must Be Thorough — and Followed
It is notable that the AEPD cited Yoti’s own Data Protection Impact Assessment (DPIA) against the company. Yoti’s DPIA documented the retention practices and data flows that the AEPD ultimately found non-compliant. A DPIA that identifies risks but does not mitigate them can become evidence of awareness without adequate action.
Action item: Ensure your DPIA process includes a feedback loop where identified risks are genuinely addressed, not merely documented.
Data Retention Best Practices: A Compliance Framework
Given the prominence of the data retention violation in the Yoti case, organizations should consider implementing the following framework:
Define Purpose-Limited Retention Periods
For each category of personal data:
- Identify the specific purpose for which the data was collected
- Determine the minimum period the data is needed to fulfil that purpose
- Set a maximum retention period with automated deletion at expiry
- Document the justification for each retention period
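The steps above can be sketched as a purpose-limited retention policy with automated purging. The categories and periods used here are illustrative assumptions for an age verification context, not legal recommendations:

```python
from datetime import datetime, timedelta, timezone

# Illustrative maximum retention per data category; each period must be
# justified against the specific purpose the data was collected for.
RETENTION_POLICY: dict[str, timedelta] = {
    "geolocation": timedelta(0),              # jurisdiction check is one-time
    "liveness_recording": timedelta(hours=72),
    "age_token": timedelta(days=30),
}

def purge_expired(records: list[dict], now: datetime) -> tuple[list[dict], list[dict]]:
    """Split records into (kept, deleted) according to the policy.

    In a real system the deleted list would feed an audit trail, so data
    deletion activities are themselves logged and reviewable.
    """
    kept, deleted = [], []
    for rec in records:
        max_age = RETENTION_POLICY[rec["category"]]
        if now - rec["collected_at"] > max_age:
            deleted.append(rec)
        else:
            kept.append(rec)
    return kept, deleted

now = datetime(2026, 3, 10, tzinfo=timezone.utc)
records = [
    {"category": "geolocation", "collected_at": now - timedelta(minutes=5)},
    {"category": "liveness_recording", "collected_at": now - timedelta(hours=12)},
    {"category": "age_token", "collected_at": now - timedelta(days=45)},
]
kept, deleted = purge_expired(records, now)
print(len(kept), len(deleted))   # geolocation and the stale token are purged
```

Running a job like this on a schedule, with its output logged, addresses three items at once: automated deletion, purpose-limited periods, and an audit trail of deletions.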
Recommended Maximums by Data Category (Age Verification Context)
| Data Category | Suggested Maximum Retention | Rationale |
|---|---|---|
| Geolocation (jurisdiction determination) | Session only / immediate deletion | Purpose fulfilled at point of collection |
| Liveness check recordings | 24–72 hours | Sufficient for fraud investigation |
| Biometric templates | Duration of active purpose only | Delete when purpose is fulfilled |
| Identity document images | Verification completion + regulatory minimum | Comply with AML/KYC where applicable |
| Age verification results | Duration of service relationship | Token-based; no need to retain underlying data |
| Fraud investigation data | As required by law | Must have separate legal basis |
Implement Technical Controls
- Automated deletion: Deploy scheduled data purging aligned with retention policies
- Access controls: Restrict access to retained personal data to authorised personnel only
- Audit trails: Maintain logs of data deletion activities
- Regular reviews: Conduct quarterly or semi-annual reviews of retained data against policy
What This Means for the Digital Identity Industry
The Yoti enforcement action sends a clear message to the rapidly growing digital identity and age verification industry: compliance is not optional, even when your services fulfil government mandates.
For Digital ID Providers
Companies offering digital identity services must invest in privacy-by-design architectures that minimise biometric data processing, implement granular consent mechanisms, and enforce strict data retention limits. The “move fast and iterate” approach common in technology companies is incompatible with the GDPR’s requirements for biometric data processing.
For Platform Operators Using Age Verification
Businesses that integrate third-party age verification services should conduct due diligence on their vendors’ GDPR compliance. Under the GDPR’s controller-processor framework, platform operators may bear responsibility for selecting non-compliant verification partners.
For Regulators
The AEPD’s ruling demonstrates that data protection authorities are willing to enforce GDPR requirements against the technical infrastructure underpinning age verification mandates. Other national DPAs may follow Spain’s lead, particularly as the EU’s eIDAS 2.0 implementation timeline approaches and digital identity wallets proliferate.
For Policymakers
Legislators mandating age verification should consider whether the compliance requirements they impose are achievable within the privacy framework they also mandate. The tension between “verify everyone’s age” and “don’t process biometric data without a valid legal basis” requires resolution at the policy level, not just the enforcement level.
Timeline of Key Events
| Date | Event |
|---|---|
| December 2023 | AEPD initiates investigation into Yoti’s practices |
| September 2024 | Yoti’s white paper on facial age estimation technology updated |
| March 2025 | Yoti’s revenue figures (€15M since inception) reported |
| November 2025 | AEPD fines Aena €10M for biometric boarding system |
| March 10, 2026 | AEPD publishes resolution fining Yoti €950,000 |
| March 2026 | Yoti announces appeal to Spanish High Court |
| September 2026 (approx.) | Deadline for Yoti to demonstrate corrective measures |
Conclusion
The AEPD’s €950,000 fine against Yoti is more than a penalty against a single company — it is a regulatory statement about the boundaries of biometric data processing in the digital identity space. The ruling establishes that authentication-based biometric matching falls under Article 9’s special category protections, that consent-by-default is insufficient, and that retaining location data for five years is disproportionate for an age verification service.
For compliance professionals, the case underscores the need for rigorous privacy-by-design practices when deploying biometric technologies. The digital identity industry is growing rapidly, driven by government mandates and market demand, but growth without proportionate privacy safeguards will meet increasing regulatory resistance.
Organizations processing biometric data should conduct immediate reviews of their legal basis for processing, consent mechanisms, and data retention policies. The six-month compliance window imposed on Yoti serves as a useful benchmark: if your organization could not demonstrate GDPR compliance within six months of a regulatory inquiry, the time to begin remediation is now.
This article is provided for informational purposes and does not constitute legal advice. Organizations should consult qualified legal counsel for guidance on their specific compliance obligations under the GDPR and applicable national data protection laws.
Sources: AEPD Resolution EXP202317887 (March 10, 2026); Yoti Ltd public response; Biometric Update; ID Tech Wire; PPC.land; Digital Policy Alert.



