Platforms like Telegram Accused of Facilitating Child Sexual Abuse Can’t Invoke Free Speech Protections
Respectable voices shouldn’t fall for Elon Musk’s hysterical claims that the arrest of Pavel Durov, the messaging service’s billionaire CEO, is part of a global jihad against speech rights
“POV: It’s 2030 in Europe and you’re being executed for liking a meme,” Elon Musk dramatically proclaimed on X last Saturday. The ominous pronouncement came as news broke of the arrest in France of his fellow tech billionaire Pavel Durov, founder and CEO of Telegram, a messaging app. X erupted in howls of outrage as right-wing influencers and outrage-peddling newsletters anointed Durov the latest martyr in the war against free speech. “Darkness is descending fast on the formerly free world,” warned Tucker Carlson with the grim certainty of someone who’d seen it all before. Their outrage hardened and their solidarity was declared with nothing known except that the arrest had happened. The narrative of an authoritarian French regime ensnaring a tech titan as a political prisoner was too perfectly aligned with their beliefs to wait for the facts.
Since then, more mainstream commentators have weighed in on the free speech implications of Durov’s arrest—amplifying the frame without grappling with the facts. The Washington Post’s conservative columnist Megan McArdle, who has a reputation for nuance, questioned “whether it’s worth sacrificing our own liberties to make it easier for the government to stop” criminal activity. This is a foundational and perennial question, but this case is not about balancing competing rights so much as reckoning with a specific platform’s complicity in criminal activity. Likewise, Reason magazine ran a blaring headline: “Telegram CEO Pavel Durov’s Arrest Is Part of a Global War on Free Speech,” arguing that “governments around the world seek to suppress ideas and control communications channels.” While that may hold true in other contexts, conflating it with this case is both misguided and counterproductive.
Content moderation on social media platforms, a term that reactionaries have sought to make synonymous with “censorship,” is a broad practice that covers everything from disrupting terrorists to dealing with trolls. It often involves making hard choices about online speech, shaping norms and conduct so that a platform can deliver a particular experience to its users. It does at times veer into bad calls that stifle free expression—but the other side of the equation involves tackling actual crimes.
Durov’s arrest is not about the former but the latter. Explicitly criminal activities have flourished on Telegram for years. Reducing this incident to a skirmish in some “global war on free speech,” a narrative popularized by Musk and his Twitter Files profiteers, oversimplifies the complexities of content moderation, ignores the nuance of this specific case, and recasts the commission of crimes as an issue of free speech.
While the outraged defenders of Durov rush to cast him as a martyr for free speech, the facts suggest a darker reality—one where a tech platform’s negligence may have facilitated heinous crimes, and where accountability is now being reframed as censorship.
Nothing to Do with Free Speech
The confirmed facts are as follows: Durov was detained at the Bourget airport outside Paris on Aug. 24 but is now out on bail. An Aug. 26 press release by French authorities listed a variety of concerns offering potential clues to the case against Durov. Among them: complicity in enabling illegal transactions in organized groups; complicity in the possession and distribution of pornographic images of minors; complicity in trafficking narcotic substances; and refusal to communicate information or documents necessary for carrying out legal operations. And, as per Politico, “refusal to cooperate with a French police inquiry into child sex abuse.”
On Wednesday, preliminary charges offered more details: refusal to provide, at the request of authorized authorities, information or documents authorized by law; complicity in offenses such as distribution of child pornography, drug trafficking, and organized fraud; and money laundering in an organized group. Contrary to the assumptions of Musk and his fans, nowhere in the charges is there any mention of “liking memes” or “insufficient moderation of legal speech.” There is a somewhat inscrutable charge related to “provision of cryptology services without a declaration of conformity” that concerned some privacy activists, though others following the case suggest it may simply be French authorities charging Durov for not filing legally required operational paperwork (perhaps something of a pile-on charge). Other than that, however, the charges focus on explicitly illegal activity—most notably, the refusal to comply with a child sexual abuse investigation.
Not an Encrypted App
Understanding what Telegram is, and the role it plays in the social media ecosystem, helps to clarify the divide between the real issue and the hysterical claims.
Telegram functions both as a messaging app for personal and group chats and as a mass-broadcasting tool offering channels for public posts. The messaging app part is fairly standard: users can send messages one-on-one or in groups. However, it’s crucial to note that, unlike on WhatsApp or Signal, private messages on Telegram are not end-to-end encrypted by default. Users must actively enable this setting, which ensures that no one except the two people in the conversation can read what was said. Group chats on Telegram cannot be end-to-end encrypted at all. If you spend a lot of time on X, you may have heard the opposite! Elon Musk goes so far as to argue that Telegram is more secure than Signal (which, according to him, had a woke board member). Durov has made this claim as well—but it is patently false. Content on Telegram that is not end-to-end encrypted is accessible to the people who run Telegram, which is one of the reasons that Durov was charged.
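The distinction is easy to make concrete. Below is a toy Python sketch, using the off-the-shelf `cryptography` library purely for illustration (this is not Telegram’s actual MTProto protocol, and all the names are hypothetical), of who holds the keys, and therefore who can read the messages, in each model:

```python
# Toy illustration of server-held keys vs. end-to-end encryption.
# NOT Telegram's real MTProto protocol; all names are hypothetical.
from cryptography.fernet import Fernet

# --- Cloud-chat model (Telegram's default, and all its group chats) ---
# The operator generates and stores the key, so it can decrypt
# anything it stores or relays.
server_key = Fernet.generate_key()
server = Fernet(server_key)
ciphertext = server.encrypt(b"hello")   # encrypted in transit and at rest...
print(server.decrypt(ciphertext))       # ...but the operator can still read it

# --- End-to-end model (Signal, WhatsApp, Telegram "secret chats") ---
# Only the two endpoints hold the key (in practice it is derived via a
# key exchange); the server relays ciphertext it cannot decrypt.
endpoint_key = Fernet.generate_key()
alice, bob = Fernet(endpoint_key), Fernet(endpoint_key)
ciphertext = alice.encrypt(b"hello")
print(bob.decrypt(ciphertext))          # readable only at the endpoints;
# server.decrypt(ciphertext) would raise InvalidToken: no key, no access.
```

In the default cloud-chat model, in other words, the operator holds the key and can read (and hand over) anything on its servers; in the end-to-end model, it merely relays ciphertext it cannot open.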
Telegram’s broadcast tools, used to share public posts with followers, are popular with media and activists worldwide. When Russia invaded Ukraine, Telegram channels figured heavily in the propaganda war on both sides. When Hamas attacked Israel on Oct. 7, and as the conflict has continued, Telegram channels provided raw visuals of the atrocities, showing what mainstream media often sanitized. There is a lot of political speech on Telegram, across nations and ideologies, and this is the type of speech that deserves protection even when it is odious. Similarly, when Twitter’s pre-Musk leadership under Jack Dorsey took down 70,000 QAnon accounts in the days following the Jan. 6 insurrection, QAnon influencers and election deniers, including prominent figures like Sidney Powell and Gen. Michael Flynn, set up shop on Telegram. These accounts were a mixed bag, with some advocating violence and others simply expressing political opinions.
It’s worth explaining why Telegram became a hub for alt-media and activists. On private social media platforms, each company’s rulebook dictates what kinds of content it will allow. Telegram largely avoids moderation. Some view this as a feature, others as a bug, but the result is that every conceivable opinion and version of “reality” can be found on Telegram, for better or worse.
Some of the content falls into what Stanford’s Daphne Keller calls “lawful-but-awful” territory, including harassment, conspiracy theories, and hate speech (the last of which is illegal in some countries but not in the United States). Platforms are generally free to set their own rules for content, and some have turned their lack of certain types of moderation policies—no fact-checking, for example—into their brand. But although social media platforms are perceived as public squares, they are also private companies that mold their policies and products in response to user and advertiser demands, and to their own business incentives. As more platforms emerge, people have the option to choose the type of environment they wish to spend time in.
Telegram’s Irresponsible, Everything-Is-Allowed Business Model
The increasing number of social media platforms has led to regulatory arbitrage in content moderation: bad actors migrate to platforms with laxer rules or weaker enforcement. One undeniable consequence of Telegram’s laissez-faire moderation policies is that it became a haven for people looking to do Explicitly Illegal Things. Bad actors expelled from other social media platforms settled there. ISIS, for example, was a prominent early adopter, turning to Telegram as platforms like Facebook and Twitter disrupted their propaganda networks. Criminal enterprises saw its potential and set up shop. And, as the French charges underscore, people interested in abusing children have gravitated there as well.
Telegram has long turned a blind eye to illegal activity, including crimes against children; if anything, it’s surprising that it hasn’t had more run-ins with law enforcement. In my previous role as Technical Research Manager at the Stanford Internet Observatory, Telegram was one of the platforms we examined when tracking how narratives spread online and investigating state propaganda operations. When we gathered data about geopolitical events, such as the Russian invasion of Ukraine and the Oct. 7 massacre in Israel, we pulled real-time data from public Telegram groups.
To do this work safely and legally on social media platforms, we set up a data ingest pipeline that filtered posts through PhotoDNA, a tool that screens images against hashes of known child exploitation content. On multiple occasions, we got PhotoDNA hits in public Telegram channels, including QAnon channels. When this happened, we filed reports, sending the URLs of the posts to the National Center for Missing and Exploited Children (NCMEC). In a paper on child sexual abuse material (CSAM) that discussed how platforms respond to such material, we noted that Telegram implicitly allows the trading of such content in private channels: “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them,” the platform writes. It requests that users not post illegal pornographic content in “publicly viewable” spaces. It offers no warnings to users against engaging in illegal activity on the platform, and it includes this reassuring statement in its FAQ: “To this day, we have disclosed 0 bytes of user data to third parties, including governments.”
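Conceptually, such a pipeline is simple, and a heavily simplified sketch may help readers picture it. In the Python below, every name (the hash list, the loader, the reporting stub) is hypothetical, and an ordinary SHA-256 stands in for PhotoDNA, which is a proprietary Microsoft perceptual-hashing tool whose hashes, unlike cryptographic ones, also match re-encoded or lightly altered copies of an image:

```python
# Heavily simplified sketch of a hash-screening pipeline; all names are
# hypothetical, and SHA-256 stands in for PhotoDNA's perceptual hashes.
import hashlib

def load_known_hashes() -> set[str]:
    """Stand-in for loading an industry hash list of known abuse imagery.
    Such lists are distributed as hashes only, never as the images."""
    return {"<hash-of-known-abuse-image>"}  # placeholder entry

def file_ncmec_report(post_url: str) -> None:
    """Stand-in for filing a CyberTipline report with NCMEC."""
    print(f"Reporting {post_url} to NCMEC")

KNOWN_HASHES = load_known_hashes()

def screen_post(image_bytes: bytes, post_url: str) -> bool:
    """Return True, and file a report, if an image matches known material."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # A hit means previously verified material: report the post's URL.
        # Researchers never view or retain the content itself.
        file_ncmec_report(post_url)
        return True
    return False
```

The point of the sketch is that detection requires nothing more exotic than comparing hashes, which is why a platform’s failure to run such checks on content its own servers can read is a policy choice, not a technical limitation.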
Not disclosing user data or content when authorities are cracking down on, say, disfavored political activity, or as an act of civil disobedience to highlight bad or unjust laws, is one thing. It is quite another when they are investigating real crimes with actual victims, such as sexually abused children.
The presence of unencrypted child sexual abuse material on Telegram, coupled with the platform’s apparent failure to report or address it, is exactly the accountability problem the French authorities are targeting. While such material is a problem on end-to-end encrypted messaging apps like WhatsApp and Signal as well, those platforms are more limited in what they can detect, and therefore disclose, during legal investigations—a tradeoff that has been the subject of intense debate among privacy and child safety advocates in tech policy circles for years. Despite these limitations, Signal and WhatsApp acknowledge the gravity of this crime and their legal obligations. Hence they take action, filing reports with NCMEC and cooperating with investigating authorities when necessary, rather than scoffing at them, as Durov seems to have done, if the charges of non-compliance against him are accurate.
Accountability Ain’t Censorship
Some have argued that it’s unfair for Durov—who is a billionaire largely due to Telegram—to be held accountable for content merely posted on his platform, content he neither posted nor sought out, simply because his company could detect it.
But that’s no excuse for not complying with an investigation. Nor are the French authorities unleashing any kind of unprecedented tyranny by charging Durov. U.S.-based social media platforms—including X—are required to report any sexually explicit content involving children that they detect (or that users report) to NCMEC; Telegram does not. They are also legally obligated to cooperate with other types of criminal investigations. While they may choose to challenge a legal request for information, the protection they enjoy for hosting what in the U.S. is “lawful-but-awful” content does not apply to explicitly illegal material even here.
Elon Musk’s Chorus of Conspiracy Theorists
As I write this, Elon Musk, Tucker Carlson, and a chorus of conspiracy theorists have escalated their hysteria campaign, suggesting, also without any evidence, that the “Biden-Harris” administration may have been involved in Durov’s arrest.
Free speech deserves to be jealously guarded. But the ill-advised and off-base response to Durov’s indictment on platforms like X that encourage follower-chasing influencers to engage in performative moral outrage does not advance this cause. To the contrary, in fact. If it seems that free speech has become a license for terrorism, money laundering, and the sexual abuse of children—and is being used to thwart the investigation of these crimes—this cherished right might well face a public backlash.
Durov and Telegram might yet be exonerated of the charges they face—and we should indeed reserve judgment as the case unfolds. But nothing about his arrest to date suggests that it has anything to do with “free speech.”
The narrative that global authoritarian censors are persecuting a free speech hero is pure myth-making in this case.
© The UnPopulist, 2024