Britain’s communications regulator Ofcom has launched an investigation into Telegram following evidence that child sexual abuse material may have been shared on the platform. The probe is part of the United Kingdom’s broader push to strengthen online child protection and enforce accountability under the Online Safety Act 2023.

The investigation follows submissions from the Canadian Centre for Child Protection, alongside Ofcom’s own assessment of Telegram’s compliance with rules governing illegal content. Regulators will examine whether the platform has failed to meet its obligations to prevent harmful material and protect minors.
The move comes as Prime Minister Keir Starmer intensifies pressure on social media companies, including Meta Platforms, YouTube, and TikTok, to strengthen safeguards for young users. The UK government is also consulting on proposals that could restrict social media access for children under 16.
In response, Telegram denied the allegations, stating it has “virtually eliminated” the public spread of such material since 2018 through advanced detection technologies. The company also raised concerns that the investigation could signal broader challenges for platforms prioritizing privacy and free speech.
Separately, Ofcom has opened investigations into platforms such as Teen Chat and Chat Avenue, citing concerns over potential grooming risks. The regulator said it remains unconvinced that these services are providing adequate protection for children.
Suzanne Cater, Ofcom’s enforcement director, warned that companies failing to comply with child safety requirements could face strict penalties under the Online Safety Act, reinforcing the UK’s tougher stance on digital platform responsibility.
Ofcom can impose fines of up to £18 million or 10 percent of worldwide revenue, whichever is greater.
In March, Ofcom said it had issued 16 fines against six companies under the Online Safety Act, totalling nearly £4 million.
Additionally, Ofcom provided updates on file-sharing services, including Pixeldrain and Yolobit, that are now either using hash-matching technology to detect and swiftly remove child sexual abuse material (CSAM) or have taken steps to prevent people in the UK from accessing their sites.
Earlier, Ofcom fined 4chan £450,000 for not having age checks in place to prevent children from seeing pornography on its site. The UK’s online safety watchdog has also fined the company £50,000 for not assessing the risk of illegal material appearing on its platform, and £20,000 for not setting out in its terms of service how it protects people from criminal content.
Data shows that nearly 80 percent of the top 100 pornography sites in the UK have age checks in place. As a result, on average, more than 7 million UK visitors a day are accessing pornography services that have deployed age assurance.
Why India has not initiated a probe into Telegram over child safety concerns
India has not launched a formal probe into Telegram over child safety concerns largely due to differences in regulatory approach and enforcement priorities. Unlike the UK’s Online Safety Act 2023, India relies on the Information Technology Act 2000 and Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021, which emphasize compliance through takedown requests rather than proactive investigations.
Indian authorities typically act upon specific complaints or court orders instead of initiating independent probes. Additionally, Telegram has faced scrutiny in India mainly over piracy and misinformation, not large-scale documented child safety violations. Limited public evidence, jurisdictional challenges due to Telegram’s overseas base, and enforcement resource constraints have also contributed to the absence of a dedicated investigation so far.
BABURAJAN KIZHAKEDATH
