Can tech leaders be held personally liable?
Is it possible that Meta’s Mark Zuckerberg or X’s Elon Musk could face prison time for what happens on their platforms?
It might seem like an outlandish question, but after Pavel Durov, the founder of Telegram, was arrested in France last month, it’s now something that experts are pondering.
Kate Klonick, an associate professor at St. John’s Law School who researches EU regulation of online platforms, said, “If I was a betting person, I would say that there will be a day that Elon Musk is on trial or be in prison in some country because of his refusal and his thumbing the nose at the rule of law.”
When his private jet touched down in France last month, Durov was arrested over allegations that he was complicit in criminal activity on Telegram, the encrypted messaging platform he runs.
Durov spent 96 hours in custody before being charged and released on bail; he is now barred from leaving France. His lawyers claim the criminal case against him is “absurd.”
In this week’s newsletter, we explore whether founders of tech platforms can be held accountable for the content on their sites. What personal liability could they face? What historical precedents exist? And in an era of increased scrutiny of big tech, what does actual accountability look like?
// Durov & Telegram
French authorities claim that Durov, the founder and CEO of Telegram (and a French citizen), is personally responsible for the illegal activity taking place on his platform. There are three main charges:
- The first charge is that Durov is complicit in the crimes taking place on Telegram by failing to moderate illegal activity. According to the Public Prosecutor's Office in Paris, this includes facilitating illegal transactions, money laundering, and “organized distribution of images of minors with a child pornographic nature, drug trafficking, organized fraud, and conspiracy to commit crimes or offenses.” Of note, a Stanford Internet Observatory investigation found that Telegram “implicitly allows the trading of CSAM in private channels.”
- The second charge is that Durov and Telegram failed to cooperate with law enforcement. Laure Beccuau, the Paris prosecutor, told WIRED there was an “almost total lack of response from Telegram to legal requests.” Unlike other executives, who have testified before lawmakers or at least signaled an effort to comply, Durov has cultivated an anti-authoritarian image and ignored requests from multiple governments to take down content or share user data.
- The third main charge is that Telegram provided its encrypted platform in France without the prior authorization and registration required by French law. If Telegram, which encrypts its users’ messages, faces legal scrutiny for its use of encryption, the case could raise larger questions about government surveillance, individual privacy, and free speech.
Durov’s arrest represents another attempt by governments to crack down on tech platforms for the harm caused online. From the congressional hearings about social media in January of this year to the recent banning of X in Brazil, the pendulum seems to have swung from a laissez-faire approach to tech regulation to heightened scrutiny and action.
The journalist Will Oremus put it this way in the Washington Post: “The crackdowns, which come months after the United States passed a law that could lead to the banning of TikTok, herald the end of an era. Not the social media era, which is still going strong. But the era in which tech titans enjoyed free rein to shape the online world — and a presumption of immunity from real-world consequences.”
The Durov arrest has also raised concerns about encroachments on free speech online. Regulators in both the EU and the US must walk a fine line between moderating content and censoring it. Telegram has branded itself as a haven for free speech and unmoderated content, which has put it in the crosshairs of EU regulators seeking to enforce some of the strictest content moderation rules in the world.
// Can leaders be personally liable?
Regardless of whether we’ve officially entered a new era of greater tech regulation, the question remains: can tech leaders be held personally liable?
The short answer is...it depends (and it's nuanced).
- In the US, Section 230 of the Communications Decency Act grants tech platforms broad immunity from liability for content posted by their users.
- The EU, under the Digital Services Act, imposes stricter content moderation obligations on platforms, but the bar is still high for prosecuting individuals for on-platform crimes.
- Last year, the UK passed the Online Safety Act, a law which, among other things, holds tech leaders personally responsible if their company fails to remove harmful content after being notified that it risks child safety.
Durov would not be the first tech executive to be held responsible for what transpired on a platform.
- Felix Somm, the head of CompuServe Germany, an internet service provider, was convicted in Germany in 1998 for failing to block access to child pornography. He was acquitted on appeal.
- Timothy Koogle, a former chief executive of Yahoo, faced charges in France in 2002 over the sale of Nazi memorabilia on the company’s auction site. He, too, was acquitted.
- Ross Ulbricht, the founder of Silk Road, an online black market focused on illicit goods and drugs, was convicted by a US federal court in 2015 for facilitating drug sales. He is currently serving a life sentence.
// Hard to prove
Daphne Keller, a professor of internet law at Stanford Law School, pointed out that prosecutors face an uphill battle in proving that a tech executive knew of illegal activity on their platform and failed to address it.
This is particularly true as platforms like Meta, TikTok, and Google have made efforts, however imperfect, to take down harmful content. Durov’s unwillingness to comply has likely made him a bigger target.
// The bigger question
Regardless of what happens with Durov (you can follow Tech Policy Press’s “Durov Arrest Tracker” here), there’s a bigger question about what true accountability looks like and whether it will lead to a safer web.
Even if tech leaders can be held criminally liable, is that a more effective approach to building a better web than holding their companies responsible and requiring them to pay fines, break up business units, or reform practices?
For those committed to building safer, more responsible tech, accountability for harms is a necessary step. But it might not be sufficient. We need to go beyond limiting harms to proactively building the infrastructure for a safer, more accessible web.
Perhaps the greatest accountability is to return the platforms to the people and create mechanisms of shared governance and control in the digital age.