In the annals of regulatory enforcement, there are rare moments when the facts are so strange that a compliance newsletter has no choice but to report them with a straight face while quietly marveling at the world. This is one of those moments.

Ofcom — the UK’s communications regulator and the enforcement body for the Online Safety Act 2023 — has issued a Confirmation Decision against 4chan Community Support LLC, finding the platform in breach of three provisions of the Act. The penalties include both lump-sum fines and ongoing daily-rate penalties. 4chan’s legal response, delivered by the firm Byrne & Storm, featured a giant hamster named Nigel J. Whiskerford, currently on tour in Japan, dressed as Godzilla, holding a peanut.

We will get to Nigel shortly. First, the compliance analysis your GRC team actually needs.

The Fine: What Ofcom Actually Found

Ofcom’s Provisional Decision was issued on February 12, 2026. In a move that will surprise no one familiar with 4chan’s general approach to institutional authority, the platform made zero substantive representations in response. Ofcom proceeded to a Confirmation Decision — meaning the findings stand and the penalties are now formal.

The three breaches:

Section 9(2) — Illegal Content Risk Assessment

4chan failed to conduct a suitable and sufficient illegal content risk assessment for 4chan.org. Under the Online Safety Act, user-to-user services must systematically assess the risk that their platform is used to encounter, share, or spread illegal content. “We didn’t do one” is not a compliant response. “We don’t acknowledge your jurisdiction” is also, legally speaking, not a compliant response.

Section 10(5) — Terms of Service Requirements

4chan failed to include Terms of Service provisions that specify how individuals are protected from illegal content. The OSA requires platforms to be transparent with users about how content is managed and what protections are in place. Having no meaningful ToS provisions on illegal content protection isn’t a gray area — it’s a direct breach.

Section 12 — Child Protection and Age Assurance

This is the most significant breach from both a regulatory and reputational standpoint. Ofcom found that 4chan failed to protect children from encountering “primary priority content” — specifically, pornographic content — through the deployment of highly effective age assurance mechanisms. The OSA sets a high bar here deliberately: “highly effective” means more than a checkbox or a “confirm you are 18” button. The platform did not meet that bar.

4chan has until 17:00 BST on April 2, 2026 to submit confidentiality representations regarding the Decision. After that deadline, the matter is settled.

The Response That Will Outlive Us All

Now. About Nigel.

4chan’s legal counsel at Byrne & Storm delivered what may be the most unorthodox formal legal communication in the history of platform regulation. In response to Ofcom’s Confirmation Decision — a serious enforcement action carrying financial penalties — the firm wrote, in part:

“Thanks. As has been explained to your agency, ad nauseam, the United Kingdom lost the American Revolutionary War. We are not in the mood to discuss the matter further, and have not been in the mood for 250 years.”

The letter continued:

“I note for the record that, last time your agency sent my client a censorship fine, we responded with a hamster joke. Since you have now sent my client a giant fine, a fine so large that Mr. Whiskers’ enclosure is not big enough to contain it, we will need to send the fine to Mr. Whiskers’ giant hamster cousin, Nigel J. Whiskerford.”

“Unfortunately, Nigel is out of the country this week, touring in Japan. Here’s a picture of Nigel in Tokyo, dressed up as Godzilla and holding an equally giant peanut.”

“My client reserves all rights and waives none. Reserved rights include the right to sue you again and/or to respond to future correspondence with an even larger rodent, such as a marmot.”

“Or, maybe, you could just stop sending Americans stupid letters and acknowledge the sovereignty of the United States.”

To be clear: this is a formal legal response to a regulatory enforcement action from a national communications authority. The argument being made is, more or less, that the UK has no authority over a US company because of events that occurred in 1776.

From a pure entertainment perspective: 10/10. From a compliance strategy perspective: this is approximately the worst possible approach a platform could take.

Why This Matters for Other Platforms

Here’s the thing about 4chan’s strategy, the “we simply refuse to engage” approach: it doesn’t make the fines disappear. It just means there are no mitigating representations on record, no evidence of good-faith engagement, and no opportunity to negotiate scope or penalty quantum. Ofcom proceeded to its Confirmation Decision without any input from 4chan, which means the regulator’s framing went entirely unchallenged.

The broader enforcement question is real: Ofcom, like all regulators, faces genuine challenges enforcing against US-based platforms that decline to participate. Domain restrictions, App Store interventions, payment processor pressure, and reciprocal diplomatic pressure are all tools that have been used or discussed. The OSA gives Ofcom significant powers — including the ability to seek court orders that could effectively block platforms in the UK. 4chan may be calculating that enforcement against a US entity is practically difficult. That calculation may be correct in the short term. It is almost certainly not a sustainable long-term strategy for platforms with meaningful UK user bases and commercial interests.

For platforms that are not 4chan — platforms with reputations to protect, advertiser relationships to maintain, and UK users who represent real revenue — the appropriate takeaway is the opposite: engage proactively, take the compliance requirements seriously, and document your good-faith efforts.

What the Online Safety Act Actually Requires

For GRC professionals who need to brief stakeholders, here is the core of what the OSA demands of user-to-user services:

  • Risk assessments: Platforms must conduct and document illegal content risk assessments. These must be suitable and sufficient — not perfunctory exercises, but genuine analyses of how the platform could be used to access or spread illegal content.
  • Terms of Service transparency: Platforms must have clear, accessible ToS provisions explaining how users are protected from illegal content. Vague, boilerplate language is unlikely to satisfy Ofcom.
  • Age assurance for adult content: Platforms that host pornographic or other “primary priority content” must implement highly effective age assurance. This is a high bar — regulators and courts have consistently rejected simple self-declaration as sufficient.
  • Safety by design: The overarching obligation is to design services with user safety built in, particularly for children. Reactive moderation after harm occurs is not compliance.

Ofcom has published detailed guidance, including codes of practice, that set out what it considers appropriate measures. GRC teams should treat these codes as near-mandatory benchmarks.

Practical Compliance Takeaways

If your platform serves UK users — even as a secondary market — the Online Safety Act applies to you. Here is what compliance teams should prioritize:

  1. Complete your illegal content risk assessment now if you haven’t. Document it. Date it. Review it annually and after any major product changes.

  2. Audit your Terms of Service for OSA-specific language. Your legal team needs to confirm you have provisions addressing how illegal content is identified, removed, and reported.

  3. Implement robust age assurance if your platform hosts any adult content. Simple date-of-birth fields will not meet the “highly effective” standard. Third-party age verification solutions are worth evaluating.

  4. Engage with regulators, even when you disagree. You are not legally required to agree with Ofcom’s findings. You are, in practical terms, significantly better off making substantive representations than sending letters about rodents and the American Revolution. Your legal team’s time is better spent building a compliance record than writing jokes that will age poorly when Ofcom publishes its enforcement decision.

  5. Monitor the enforcement pipeline. 4chan is not the only platform Ofcom is watching. As the OSA enforcement program matures, platforms that have documented their compliance efforts will be in a materially better position than those that haven’t.

As for Nigel J. Whiskerford: we wish him well in Tokyo. We hope the peanut is large enough to cover the penalties. We suspect it is not.

The Online Safety Act has teeth. Ofcom is using them. That’s the compliance story here — even if the coverage is going to be dominated by the hamster.