Ofcom Expands Investigation into Telegram and Teen Chat Platforms Amid Child Safety Concerns
The UK communications regulator, Ofcom, has broadened its investigation into major online platforms, including Telegram, Teen Chat, and Chat Avenue. The inquiry aims to determine whether these services are doing enough to prevent child sexual abuse and online grooming. The investigation falls under the Online Safety Act, which requires platforms to assess and mitigate the risks of illegal content, particularly child sexual abuse material (CSAM).
Ofcom’s decision to launch this investigation was prompted by evidence suggesting the presence of harmful content and predatory behavior on these platforms, raising significant concerns regarding user safety, especially for minors.
Ofcom Investigation Into Telegram over CSAM Risks
A central focus of the Ofcom investigation is Telegram and the potential presence of CSAM on the platform. Ofcom confirmed that it received intelligence from the Canadian Centre for Child Protection indicating that CSAM was allegedly being shared on the service. This intelligence has led the regulator to conduct its own assessment to determine whether Telegram has failed to meet its legal obligations under the Online Safety Act.
In the UK, both the possession and distribution of CSAM are criminal offenses, placing a substantial responsibility on platforms to actively detect and remove such content. Regulators have stated that platforms facilitating user-to-user communication must implement robust systems to identify and mitigate risks. The ongoing investigation will evaluate whether Telegram has sufficient safeguards in place or if enforcement gaps have allowed illegal content to circulate unchecked.
Teen Chat Platforms Under Scrutiny for Grooming Risks
The Ofcom investigation also encompasses Teen Chat and Chat Avenue, which are being scrutinized for their potential role in facilitating online grooming. These platforms provide features such as open chatrooms, private messaging, and media sharing, which regulators argue can be exploited by predators.
Online grooming can involve coercing minors into sharing explicit content, engaging in sexual conversations, or arranging offline meetings. Ofcom has been collaborating with child protection agencies to pinpoint services where such risks are heightened. Despite previous engagements with these companies, the regulator remains unconvinced that adequate protections are in place. The investigation will assess whether these platforms are effectively evaluating risks and taking necessary steps to shield children from harmful or illegal activities. In the case of Chat Avenue, the inquiry will also examine whether sufficient safeguards exist to prevent minors from accessing explicit content.
File-Sharing Platforms Show Mixed Progress
In addition to messaging and chat services, Ofcom has also reviewed file-sharing platforms, which have historically been utilized to distribute CSAM. Some progress has been noted in this area. For example, Pixeldrain has implemented perceptual hash-matching technology, enabling automated detection and removal of known abusive content. This development followed Ofcom’s concerns regarding the platform’s initial lack of safeguards.
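To illustrate the general idea behind hash-matching detection: unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized or re-encoded, so known abusive images can still be matched after minor edits. The sketch below is a toy "average hash" in plain Python, not the technology Pixeldrain actually deployed (production systems use far more robust schemes such as PhotoDNA or PDQ); the function names and the 8x8 input format are illustrative assumptions.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    pixels: list of 8 rows of 8 brightness values (0-255), assumed to be
    an already-downscaled image. A toy stand-in for real perceptual
    hashes such as PhotoDNA or PDQ.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: brighter than average -> 1, else 0.
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_blocklist(pixels, blocklist, threshold=5):
    """Return True if the image's hash is within `threshold` bits of any
    known hash. The tolerance is what lets the match survive minor edits
    that would defeat an exact (cryptographic) hash comparison.
    """
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocklist)
```

In a real deployment, the blocklist of known hashes is supplied by child-protection bodies (such as the Canadian Centre for Child Protection mentioned above) rather than computed by the platform itself, and matches are escalated for removal and reporting.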
Another service, Yolobit, has restricted access to users in the UK, leading Ofcom to conclude its investigation. Several other file-sharing providers have adopted similar measures, either blocking UK access or deploying detection technologies in response to regulatory action. These advancements indicate that regulatory pressure is prompting some platforms to enhance their safety measures, although the Ofcom investigation reveals that broader risks persist across various online services.
Enforcement Powers and Next Steps
The Ofcom investigation operates under a structured process outlined in the Online Safety Act. Regulators will gather and analyze evidence before determining whether a platform has breached its legal obligations. Companies will have the opportunity to respond before any final decisions are made.
If violations are confirmed, Ofcom has the authority to impose stringent penalties, including fines of up to £18 million or 10 percent of qualifying worldwide revenue, whichever is greater. In more severe cases, courts can order business disruption measures, such as requiring internet service providers to block access to a platform in the UK or cutting off its payment and advertising services.
Suzanne Cater, Director of Enforcement at Ofcom, emphasized that addressing child exploitation remains a top priority. While some progress has been observed, particularly among file-sharing services, risks continue to exist across larger platforms and youth-focused chat services.
Growing Pressure on Platforms to Comply
The Ofcom investigation underscores the increasing regulatory scrutiny faced by online platforms operating in the UK. Under the Online Safety Act, any service accessible to UK users must adhere to local laws, irrespective of the company’s location.
With investigations now underway across messaging apps, chat platforms, and file-sharing services, the regulator is signaling that failure to protect users, especially children, will have serious repercussions. As the Ofcom investigation progresses, further updates are anticipated regarding whether these platforms will face enforcement actions or be required to enhance their safety measures.
Source: thecyberexpress.com