Telegram has faced immense scrutiny this year over its handling of harmful content on its platform. The pressure intensified following the arrest of its founder, Pavel Durov, in France; Durov is currently facing charges related to harmful material allegedly shared through the messaging app.
In response, Telegram launched a major crackdown in September and recently revealed that over the course of 2024 it has removed 15.4 million groups and channels involved in activities such as fraud and terrorism. The platform credits this figure to newly deployed AI-powered moderation tools, which it says have significantly improved its ability to identify and remove such content.
To improve transparency around these efforts, Telegram has introduced a dedicated moderation page, announced in a recent post on Durov’s Telegram channel. The data presented on the page shows a sharp rise in enforcement actions following Durov’s arrest in August, underscoring Telegram’s push to address regulatory concerns and clean up its platform.