LONDON, April 21 (Reuters) – Britain’s communications regulator, Ofcom, launched an investigation on Tuesday into the Telegram messaging app after evidence suggested child sexual abuse material was being shared on the platform.
The probe is part of UK efforts to hold online platforms accountable for exposing children to harm. While the country’s 2023 Online Safety Act set tougher standards for social media platforms such as Facebook, YouTube and TikTok, Prime Minister Keir Starmer wants those platforms to go further.
The government has been consulting on a potential social media ban for children under 16, and Starmer met last week with social media company executives where he asked them to take more responsibility.
Ofcom said it had received evidence from the Canadian Centre for Child Protection regarding the alleged presence and sharing of child sexual abuse material on Telegram, and had carried out its own assessment of the platform.
“In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content,” Ofcom said in a statement.
Telegram said it “categorically” denied Ofcom’s accusations, adding that since 2018 it had “virtually eliminated” the public spread of child sexual abuse material on its platform through detection algorithms.
“We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy,” the Dubai-based company said in a statement.
Telegram was fined in February by Australia’s online safety regulator for delays in answering questions about the measures it had taken to prevent the spread of child abuse and violent extremist material.
Ofcom said on Tuesday it had also opened investigations into Teen Chat and Chat Avenue to examine whether they were meeting their duties to protect children from the risk of being groomed by predators.
It said that after engaging with the companies, it remained unconvinced that they were providing adequate protection to British children from the risk of grooming.
“These firms must do more to protect children, or face serious consequences under the Online Safety Act,” Suzanne Cater, Director of Enforcement at Ofcom, said in the statement.
(Reporting by Muvija M; Editing by Paul Sandle and Susan Fenton)