Britain’s communications regulator, Ofcom, has launched a formal investigation into the anonymous online message board 4chan, scrutinizing whether the platform has adequately protected users from illegal content under the UK’s Online Safety Act 2023. Announced on June 10, 2025, the probe targets 4chan’s potential failures in assessing risks and implementing measures to mitigate harmful material, including child sexual exploitation and abuse. The move comes amid growing regulatory pressure on digital platforms to curb illicit activity, with Ofcom wielding new powers to impose fines of up to 10% of a company’s global revenue or £18 million, whichever is greater.
The investigation is part of a broader enforcement wave encompassing nine platforms in total, including seven file-sharing services and a pornography provider called First Time Videos. According to Ofcom’s official statement, 4chan allegedly ignored requests for information and may lack the risk assessments required by the Act, which requires services to proactively identify and remove illegal content.
Escalating Scrutiny on High-Risk Platforms
Industry observers note that 4chan, known for its unmoderated forums and history of hosting controversial content, represents a test case for the Online Safety Act’s reach. The platform’s structure, with anonymous posting and minimal oversight, has long drawn criticism for enabling the spread of hate speech, extremism, and illegal imagery. Ofcom’s action follows complaints about unchecked illegal activities, as highlighted in a June 11, 2025, article by heise online, which detailed the regulator’s response to reports of violations on the notorious image board.
Parallel probes into file-sharing sites underscore Ofcom’s focus on ecosystems prone to distributing child sexual abuse material. For instance, the regulator is examining whether these services have failed to deploy age verification or content moderation tools, echoing concerns raised in a June 10, 2025, report from The Guardian, which noted the investigations’ emphasis on protecting vulnerable users.
Implications for Global Tech Compliance
The Online Safety Act, which became law in late 2023, imposes duties on user-to-user services to conduct risk assessments and ensure user safety, particularly for children. Non-compliance could lead to substantial penalties, and in severe cases Ofcom might seek court orders to block access in the UK. The prospect has sparked debate in online communities, with Reddit threads on subreddits like r/ukpolitics and r/Asmongold discussing potential fines of £20,000 per day, as referenced in posts from August 17, 2025, reflecting user anxieties over censorship versus safety.
For tech insiders, the 4chan case highlights the challenges of regulating decentralized platforms. Unlike social media giants with robust moderation teams, 4chan’s model relies on volunteer janitors, which critics argue is insufficient under the new law. A June 10, 2025, piece in BBC News reported that Ofcom’s investigations extend to pornography sites for age-check failures, signaling a comprehensive crackdown on non-compliant operators.
Potential Outcomes and Industry Ripple Effects
If violations are confirmed, 4chan could face operational overhauls, including enhanced moderation or geoblocking of UK users. This aligns with Ofcom’s recent actions against dozens of adult content sites, as covered in an August 1, 2025, article by UKTN, which detailed probes into age verification compliance.
Broader implications include heightened compliance costs for platforms worldwide, potentially influencing regulations in other jurisdictions. As Ofcom enforces these rules, the balance between free expression and harm prevention remains a flashpoint, with 4chan’s fate likely to set precedents for similar sites.


WebProNews is an iEntry Publication