In the concluding instalment of our Big Story, we lay out the debate over the implications of Pavel Durov’s arrest. Is Durov just a bad apple who had it coming—as other tech companies claim? Or does his arrest set a global precedent—opening the door to governments (like ours) eager to restrict privacy and free speech rights?
Editor’s Note: This is the final instalment of a two-part Big Story on Durov’s arrest. Part one looks at Durov’s shady rep—and the charges against him.
The Durov arrest: A quick recap
Durov was detained last week—when he landed in Paris in his private jet. He has been charged with managing an online platform that enabled “illegal transactions by an organised group.” As the founder, he is being held responsible for the crimes committed on Telegram—be it distribution of child sex abuse material, sale of drugs, or financial fraud. Finally, he has been accused of refusing to cooperate with law enforcement.
Why this matters: Until now, the internet has been treated as the Wild West—with tech companies left mostly free to make their own rules. Over the past decade, governments have grown unhappy with this lack of regulation—for reasons both good and bad. And many—including New Delhi—are moving to tighten the reins:
The arrest is the “most dramatic action to date” in the “global fight between officials and tech companies over limits to harmful content,” said The Washington Post. Durov’s arrest “reignites a fierce debate” over free speech on social media platforms, as governments step up efforts to “police the role of social media and messaging platforms in spreading illegal and false information.”
Arresting Durov breaks new ground—opening the door to a similar escalation in other democracies.
What’s at stake: This war is about two key issues: privacy (encryption) and free speech (content moderation). The French government has laid claim to new territory in both. Is it right to do so? The answer isn’t as black & white as it seems.
The problem of moderation
Both Durov and Elon Musk claim that their aversion to strict content policies stems from their devotion to free speech. Crack down on, say, a racist post, and it will be open season for every other kind. But Telegram has been exceptionally lax—even by the internet’s extremely loose standards.
Where the vile stuff lives: Most notable is Telegram’s refusal to cooperate with law enforcement:
Unlike U.S.-based platforms, Telegram is not required by U.S. law to report instances of CSAM to the National Center for Missing and Exploited Children, or NCMEC. Many online platforms based overseas do so anyway — but not Telegram.
The result is a cesspool of pedophiles and other sexual predators—who freely trade and sell child abuse videos—alongside vile deepfakes:
A U.S. Army soldier was arrested last week for allegedly creating public Telegram groups to store child sexual abuse imagery and sending himself video files that included children being raped… [In August] South Korea opened an investigation into a labyrinth of Telegram group chats where anonymous users submitted photos of South Korean girls and women that were turned into sexual imagery without their permission using artificial intelligence.
As for India: More than 1.9 million pieces of child sex abuse content were reported from India in 2019—the highest in the world. Many of these are sold on Telegram—priced anywhere from Rs 40 to Rs 5,000—discounts available if you’re willing to bargain.
Where the terrorists live: Telegram first gained notoriety as a haven for terrorists back in 2015:
In 2015, Telegram became the primary communications platform for the Islamic State (ISIS), allowing the terror group to coordinate activities through private messages, as well as spread propaganda and recruit members through large group chats. The group even set up tech support channels on Telegram to help members avoid surveillance.
When pressured to shut down these groups—in the midst of horrific beheadings—Durov said:
It’s a series of tragic events but ultimately ISIS will always find a way to communicate within themselves… I don’t think we should feel guilty about this. I still think we’re doing the right thing: protecting our users’ privacy.
Telegram eventually shut down the channels—and instituted a stronger anti-terrorism policy. But it has remained a hub for all kinds of extremist networks—from fascists in the UK to Hamas in the Middle East and QAnon in the US.
Key point to note: Extremist content is not limited to terrorist groups:
Last month, it emerged in the Israeli press that a psychological warfare unit of the IDF, or Israel Defense Forces, had run an unauthorised Telegram Channel entitled “72 Virgins — Uncensored” that posted graphic and violent content, such as videos of corpses, allegedly Hamas members, targeted at its own citizens.
Feature not a bug: Critics of the platform argue that the lack of moderation is about revenue, not principle. It is far more “lucrative” to leave the vile stuff alone:
[Durov] bragged to the Financial Times earlier this year that each user only cost him 70 cents a year to support. Ramping up the teams necessary to respond to law enforcement requests, disrupt networks of CSAM traders and other criminals, and address other content moderation needs would eat into Durov’s profit margin — and potentially disrupt his planned initial public offering.
Telegram has only seven system administrators—who manage more than 80,000 servers around the world. Durov boasts that keeping his team super-lean leaves it no choice but to “automate things at an extreme level… As a result, we became so greatly cost efficient.”
Point to note: Telegram’s policy on blocking or restricting content is deliberately inconsistent. Take, for example, Hamas channels. Some are only available to those with Android phones—or on the web, but not on Apple devices. As one expert puts it: “We don’t know exactly what is supposed to be taken down. We don’t know which measures apply and when — is a channel taken down fully, is it geoblocked in a certain country, is it only available on the web, not Apple or Android?”
The trickier problem of encryption
Contrary to Durov’s big talk about privacy, Telegram’s protections are far weaker than those of rivals like WhatsApp. As a rule, your messages are not encrypted end-to-end—the kind of encryption that prevents the platform itself from seeing their content. In fact, end-to-end encryption is only available as an ‘opt in’—via something called ‘secret chats’:
The service does offer end-to-end encryption, through a little-used opt-in feature called “secret chats” but, by default, conversations are encrypted only insofar as they can’t be read by any random person connected to your wifi network. To Telegram itself, any messages sent outside a “secret chat” – which includes every group chat, and every message and comment on one of the service’s broadcast “channels” – is effectively in the clear.
Many people are not aware of the small print—which has allowed Durov to “aggressively market” Telegram—falsely implying rivals like Signal and WhatsApp “were backdoored by the US government, and only Telegram’s independent encryption protocols were really trustworthy.”
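To make the distinction concrete, here is a minimal Python sketch of who holds the decryption key in each model. It uses the `cryptography` library’s Fernet recipe purely for illustration: this is not Telegram’s actual MTProto protocol, and the names (`server_box`, `alice`, `bob`) are hypothetical.

```python
# Toy illustration of server-side vs end-to-end encryption. This is NOT
# Telegram's MTProto or Signal's protocol; it only shows who holds the key.
from cryptography.fernet import Fernet

# --- Default ("cloud") chats: the platform holds the key ---
server_key = Fernet.generate_key()   # generated and stored by the platform
server_box = Fernet(server_key)

ciphertext = server_box.encrypt(b"meet at 6pm")   # protected in transit/at rest
# ...but the platform can decrypt any message on demand:
print(server_box.decrypt(ciphertext))             # b'meet at 6pm'

# --- End-to-end ("secret") chats: only the two devices hold the key ---
shared_key = Fernet.generate_key()   # in reality, negotiated via a key exchange
alice = Fernet(shared_key)           # Alice's device
bob = Fernet(shared_key)             # Bob's device

e2e_ciphertext = alice.encrypt(b"meet at 6pm")
# The server relays e2e_ciphertext but holds no key, so it sees only noise.
print(bob.decrypt(e2e_ciphertext))                # b'meet at 6pm'
```

The point of the sketch: in the first model, “encrypted” only means strangers on your wifi can’t read your messages; the company still can.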
But, but, but: Secret chats have been just as invaluable for political dissidents battling authoritarian governments—as have features like self-deleting messages. The same features that shelter pedophile networks also protect democratic freedoms. This is where France’s decision to arrest Durov enters dangerous territory:
French law enforcement has long hated encryption. This seems like a potential avenue for them to blame what happens on Telegram at least in part on encryption, when the truth is that the other counts suggest that Telegram’s noncooperation with judicial orders is the real problem.
A worrying bit of evidence: In France, a company has to apply for a licence to import encryption technology—which Telegram apparently failed to do. What’s notable: “Some of the tech companies were surprised by the cryptology charge because it was unclear to them that a licence was needed in France for the technology.” According to the Electronic Frontier Foundation: “France very likely knew for many years that Telegram had not filed the required declarations regarding their encryption, yet they were not previously charged for that omission”—until now. All of which suggests that authorities could be using Telegram to send a warning shot on encryption.
Meanwhile, in India: The government is pressuring WhatsApp to break encryption—supposedly to shut down child sex abuse and disinformation networks. The revised IT rules require all tech companies to maintain “traceability” for messages—so law enforcement can trace a forwarded message to find the person who first sent it. But to do so, WhatsApp would need to ‘see’ all messages:
Tracing the copies back to their source would require building a new layer of surveillance into WhatsApp. “There is no way to predict which message a government would want to investigate in the future,” the company wrote in a blog post in 2021. “To comply, messaging services would have to keep giant databases of every message you send, or add a permanent identity stamp — like a fingerprint — to private messages with friends, family, colleagues, doctors, and businesses.”
In a High Court case challenging the law, the company recently declared it would rather leave the country than comply.
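To see why compliance is so invasive, consider a hypothetical sketch of the “permanent identity stamp” described above. This is not WhatsApp’s design (the company has refused to build one); the function names and database are invented for illustration. It simply shows that tracing any future message requires fingerprinting and storing every message in advance.

```python
# Hypothetical sketch of a traceability "identity stamp". NOT how WhatsApp
# works; it illustrates why compliance means fingerprinting every message.
import hashlib

fingerprint_db: dict[str, str] = {}  # fingerprint -> first sender, kept forever

def send_message(sender: str, text: str) -> str:
    """Stamp and record a message so it can later be traced to its originator."""
    fp = hashlib.sha256(text.encode("utf-8")).hexdigest()
    # Only the *first* sender is recorded; forwards reuse the same fingerprint.
    fingerprint_db.setdefault(fp, sender)
    return fp

def trace_originator(text: str) -> str | None:
    """What a traceability order would demand: who first sent this text?"""
    fp = hashlib.sha256(text.encode("utf-8")).hexdigest()
    return fingerprint_db.get(fp)

send_message("alice", "forwarded rumour")
send_message("bob", "forwarded rumour")      # a forward: fingerprint already known
print(trace_originator("forwarded rumour"))  # 'alice'
```

The catch sits in the database itself: since no one can predict which message a court will someday ask about, the fingerprints have to cover all of them—which is precisely WhatsApp’s objection.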
Meanwhile, in the UK: This isn’t just a case of BJP overreach. A proposed online safety law in the UK is not all that different:
Under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption. If the company refused to do so, it could face fines of up to 4% of its parent company Meta’s annual turnover—unless it pulled out of the UK market entirely.
WhatsApp has threatened to walk out of the UK as well—as has Signal.
All for one, one for all: Global tech companies cannot break encryption for just one country, as WhatsApp chief Will Cathcart pointed out in the case of the UK:
It’s a remarkable thing to think about. There isn’t a way to change it in just one part of the world. Some countries have chosen to block it: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.
The bottomline: Digital rights NGOs like EFF are taking a cautious line on Durov’s arrest:
Those running similar services may not have anything to fear, or these charges may be the canary in the coalmine warning us all that French authorities intend to expand their inspection of messaging and social media platforms. It is simply too soon, and there is too little information for us to know for sure.
But the trendlines are crystal clear. Be it Durov’s arrest, the Brazilian ban on X, or new IT laws in India and the UK, a new era of internet regulation is upon us. The only question is whether or not it will include guardrails to protect you, dear user.
Reading List
The Electronic Frontier Foundation has the most detailed breakdown of the charges against Durov. Financial Times (splainer gift link) offers a deep dive into the debate over Telegram and its founder’s business practices. Washington Post is best on the pedophile networks on the platform—while Decode looks specifically at India. New York Times offers the bigger picture on encryption. Platformer makes the case against Telegram as a bad apple—one that does not deserve support. Rest of World is best on why WhatsApp is threatening to exit India. MoneyControl reports on investment fraud conducted via Telegram in India.