Concerns have resurfaced regarding Telegram’s role in the dissemination of illegal content, particularly child sexual abuse material, prompting renewed scrutiny from regulators and child protection organizations worldwide. The messaging platform, known for its strong encryption and privacy features, has faced allegations that its design enables the circulation of harmful material despite efforts to moderate content. These claims come amid broader debates about balancing user privacy with platform accountability in the digital age.
Telegram, founded in 2013 by brothers Nikolai and Pavel Durov, has grown to over 700 million monthly active users as of 2024, positioning itself as a major competitor to other messaging apps like WhatsApp and Signal. The platform offers end-to-end encryption only in its optional secret chats; regular cloud chats are encrypted between client and server but are not end-to-end encrypted by default. This distinction has been central to discussions about how content moderation can occur on the service, especially given that group chats and channels, where much of the alleged illegal activity is said to occur, do not use end-to-end encryption.
In recent years, law enforcement agencies in several countries have reported encountering illegal content shared via Telegram, including extremist propaganda and child sexual abuse material. A 2023 report by the Internet Watch Foundation noted that Telegram was among the platforms used to share such content, though it ranked below dedicated hosting services as a source of such material. The company has maintained that it removes illegal content when reported and cooperates with authorities under applicable legal frameworks, citing its terms of service, which prohibit the distribution of harmful material.
Despite these statements, critics argue that Telegram’s moderation practices lack transparency and responsiveness. Unlike some platforms that employ large teams of moderators and AI-assisted detection systems, Telegram has historically relied on user reporting and a smaller internal team. In 2022, the platform introduced a feature allowing users to report illegal content directly within the app, a move welcomed by safety advocates but described by some as insufficient given the scale of the network.
The company’s stance on privacy has also drawn both praise and criticism. Telegram has resisted government requests to weaken encryption or provide backdoors, asserting that such measures would compromise security for all users. This position has led to periodic conflicts with authorities; for example, Russia attempted to block Telegram in 2018 over its refusal to hand over encryption keys, though the ban was lifted in 2020 after the company reportedly agreed to cooperate more closely with local regulators on terrorism-related cases.
More recently, in 2023, the European Union’s Digital Services Act (DSA) began imposing stricter obligations on large online platforms, including requirements for risk assessments, content-moderation transparency, and rapid removal of illegal content. Telegram, which surpassed the 45 million user threshold in the EU, is classified as a very large online platform (VLOP) under the DSA and must comply with these rules or face potential fines of up to 6% of global turnover. The company has stated it is working to meet its obligations, including publishing transparency reports and establishing a point of contact for EU authorities.
Child safety organizations continue to urge Telegram to adopt more proactive measures, such as implementing hash-matching technology to detect known illegal images and improving reporting mechanisms. They emphasize that although encryption protects legitimate user privacy, it should not impede the identification and removal of material that exploits minors. Some experts suggest that client-side scanning—controversial due to privacy concerns—could be one avenue, though Telegram has not indicated plans to adopt such technology.
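To illustrate what hash-matching involves, the sketch below checks files against a placeholder set of known hashes. This is a minimal illustration, not Telegram’s implementation: the KNOWN_HASHES set and helper names are hypothetical, and real deployments rely on perceptual hashing systems such as Microsoft’s PhotoDNA, which tolerate resizing and re-encoding, whereas the exact SHA-256 comparison shown here only catches byte-identical copies.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad hashes; in practice this would be a large,
# regularly updated database supplied by a child-safety clearinghouse.
KNOWN_HASHES: set[str] = set()

def file_sha256(path: Path) -> str:
    """Compute a file's SHA-256 digest, reading in chunks to bound memory use."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    return file_sha256(path) in KNOWN_HASHES
```

The sketch makes the underlying trade-off visible: exact hashes are cheap to compare and reveal nothing about non-matching files, but they are defeated by changing a single byte, which is why safety advocates push for perceptual variants despite their higher false-positive risk.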
As of early 2024, Telegram has not publicly disclosed the volume of illegal content reports it receives or acts upon, though it publishes biannual transparency reports detailing government requests for user data. The most recent report, covering the second half of 2023, showed an increase in legal requests from various jurisdictions, though the company did not break down how many pertained to child safety or terrorism-related investigations.
Moving forward, regulators and advocacy groups are calling for greater clarity on how Telegram balances its privacy commitments with its responsibility to prevent harm. The outcome of ongoing DSA compliance evaluations in Europe may set a precedent for how similar platforms handle these challenges globally. For users seeking guidance on safety features, Telegram provides in-app reporting tools and safety resources through its official website, though experts recommend additional vigilance, especially in public channels and groups where moderation may be limited.
As discussions continue about the role of encrypted platforms in online safety, the focus remains on finding solutions that protect both user rights and vulnerable populations. No single approach has gained universal acceptance, but sustained dialogue between technology companies, regulators, and civil society is seen as essential to addressing these complex challenges.
For the latest updates on Telegram’s compliance with regional regulations and safety initiatives, readers can refer to official announcements from the company’s blog or statements from regulatory bodies such as the European Commission. Staying informed through credible sources helps ensure a balanced understanding of both the benefits and risks associated with modern communication platforms.
We welcome your thoughts on this important topic. Share your perspective in the comments below, and consider sharing this article to help foster informed discussion about technology, safety, and responsibility in the digital world.