April 23, 2024 // Did someone forward you this newsletter? Sign up to receive your own copy here.
5 insights on the future of Trust & Safety
Trust & Safety is one of the most important and misunderstood fields in today’s tech landscape.
It employs thousands of workers globally, and its efforts are crucial to minimizing online harms and striving to make tech platforms safe and healthy places.
Trust & Safety efforts built into everyday platforms are often hidden from view. Yet they’re garnering more attention given the importance and complexity of the tasks, and the toll the work is taking on the workforce.
This week, we’re demystifying Trust & Safety by highlighting five insights about the field from our conversation with Eli Sugarman, a Senior Fellow at the Special Competitive Studies Project.
Previously, Sugarman was Vice President of Content Moderation at the Meta/Facebook Oversight Board, and Director of the Cyber Initiative at the William and Flora Hewlett Foundation, where he led a ten-year, $160 million grant-making effort to build the field of cyber policy.
// What is Trust & Safety?
Sugarman defines Trust & Safety as the efforts of companies and civil society to 1) minimize harms and risks, and 2) promote positive impacts in online spaces and across online products.
Trust & Safety applies to a wide range of technologies and teams — not just the Trust & Safety teams inside social media companies like Meta, X, TikTok, and Bluesky, but also those on other platforms like Venmo, Etsy, and Uber.
Here are five insights from Sugarman about this fast-moving and misunderstood space.
// Insight #1: Trust & Safety at big tech platforms is complex
For big tech platforms like Meta, Google, and TikTok, Sugarman emphasized how complex the Trust & Safety effort is, and how there are both opponents and allies within these companies.
Complicated operations: Many big tech platforms have built robust Trust & Safety operations with a byzantine web of policies, vendors, and thousands of employees (often through contractors). However, the foundations of these systems were often built years ago before the platforms became huge. As these cobbled-together systems have scaled to a level no one imagined, they’ve begun to break, even as companies continue to invest billions.
Imperfect execution: Facebook has three billion monthly active users, and people watch over one billion hours of YouTube daily. These platforms get criticized for not being perfect, Sugarman said, but moderating content on global platforms is hard work, and bad-faith actors will bypass safeguards or game policies to ensure they go undetected.
While the big tech platforms get all the press, Sugarman believes philanthropy should direct its attention and its dollars to smaller platforms.
// Insight #2: The biggest opportunity is with small and midsize platforms
While governments might be best positioned to reform the biggest tech companies through regulation, philanthropy can play a role in shaping the next generation of tech companies.
Small and midsize tech companies are in the early stages where Trust & Safety can still be baked into their DNA. On the smallest platforms, often the only person thinking about Trust & Safety is a community moderator or a team of 2-3 part-time volunteers. Investing in smaller platforms creates the opportunity to design for scale from the beginning, Sugarman said, instead of trying to shift policies, update org charts, and mend already-broken systems in much larger companies.
Sugarman believes a suite of solutions is necessary:
Build and maintain open-source tools, model policies, and other resources, especially for small and midsize platforms.
Create a global network of Trust & Safety education and research centers that produce a diverse and capable talent pipeline, deliver impactful research, and better define the parameters of the field.
Shape emerging regulations to change the behavior of platforms for the better.
Make a compelling business case for greater investment in Trust & Safety within companies and by private capital into the Trust & Safety vendor/start-up market.
Treat Trust & Safety as a core business function with a "seat at the table" to help make key business decisions for a company.
//
The approach to Trust & Safety needs to shift from attempting to detect and moderate content to identifying the actors and behaviors behind the content.
//
// Insight #3: We need to shift the approach to Trust & Safety
The approach to Trust & Safety needs to shift from attempting to detect and moderate content to identifying the actors and behaviors behind the content. Instead of fixating on the content itself, teams are uncovering actors and behaviors across platforms to understand where malicious content might show up next. Otherwise, content moderation is an endless game of whack-a-mole.
This means the expertise necessary for effective Trust & Safety is evolving—moving away from writing policies that detect content and toward policies that attempt to map the underlying behaviors that lead to harmful content. That requires more extensive data analysis to search for signals and patterns within the profiles and behaviors of users: When was this account created? How many similar accounts have been created? Are they working in a coordinated effort to spread disinformation?
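As a rough illustration of this behavior-first approach, the sketch below is a toy example (all account data, field names, and thresholds are invented, not any platform's real API): it flags groups of accounts that were created within a narrow time window and post identical content, one crude signal of coordinated inauthentic behavior.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical account records: (account_id, created_at, message).
accounts = [
    ("a1", datetime(2024, 4, 1, 10, 0), "Vote no on measure X"),
    ("a2", datetime(2024, 4, 1, 10, 5), "Vote no on measure X"),
    ("a3", datetime(2024, 4, 1, 10, 7), "Vote no on measure X"),
    ("a4", datetime(2023, 1, 15, 9, 0), "Photos from my hike today"),
]

def flag_coordinated(accounts, window=timedelta(hours=1), min_cluster=3):
    """Group accounts by identical message text, then flag groups whose
    creation times all fall within a narrow window — a crude signal of
    coordinated behavior, independent of what the content itself says."""
    by_message = defaultdict(list)
    for acct_id, created, message in accounts:
        by_message[message].append((created, acct_id))

    flagged = []
    for message, group in by_message.items():
        if len(group) < min_cluster:
            continue
        times = sorted(t for t, _ in group)
        # All accounts in the cluster were created inside the window.
        if times[-1] - times[0] <= window:
            flagged.append((message, sorted(a for _, a in group)))
    return flagged

print(flag_coordinated(accounts))
# → [('Vote no on measure X', ['a1', 'a2', 'a3'])]
```

Real systems combine many such signals (posting cadence, shared infrastructure, near-duplicate rather than exact-match text), but the shape is the same: profile the actors and their behavior rather than chasing each piece of content.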
// Insight #4: We need to grow the talent pipeline in Trust & Safety
Today’s global expertise in Trust & Safety has been forged through hard-won experience over time, but Sugarman still sees a talent shortage in the Trust & Safety field. “We need more university education in Trust & Safety. It needs to become a proper field with a rigorous, regimented academic pathway from community colleges to elite universities.”
There is momentum at universities like Stanford and Columbia, and there’s a Journal of Online Trust and Safety, but it’s still early days, according to Sugarman. The high-profile layoffs of Trust & Safety teams at major tech companies are not a fair depiction of the global demand for workers in this field—demand that extends beyond the US and Europe into every country.
// Insight #5: We're in a narrow window of time
A report released last year argued that there’s a narrow window of time to build the next generation of tech companies with Trust & Safety in their DNA.
Decentralization & fragmentation: More and more people are migrating away from the big tech platforms to smaller alternatives (like platforms in “the fediverse”). This trend will only accelerate, and the number of small platforms will grow, creating an opportunity to influence emerging platforms before the number of platforms outpaces the capacity to influence them.
The bite of regulation: The smaller, early-stage platforms are watching as governments crack down on bigger platforms, and they know that regulators will focus on them next, so they’re rushing to get their Trust & Safety operations in order.
Good for business: Platforms of all sizes are beginning to recognize that “Trust & Safety can be good business,” Sugarman said. Neglecting it carries reputational risk, but successful Trust & Safety efforts can drive strong financial performance, as the recent IPO of Reddit demonstrated.
AI technology: Generative AI is a force multiplier for the Trust & Safety field. It can generate harmful content at scale, but it can also be a first line of defense in identifying unsafe content, behaviors, and actors. Harnessing this technology while preventing malicious actors from abusing it will only get more challenging, Sugarman said.
// Building the field for the future
Sugarman was involved in the early days of building the cybersecurity field a decade ago, and he sees similarities today with Trust & Safety. He hopes the field will grow as meteorically as the cybersecurity field did, with billions of dollars invested every year.
"Everyone who wants AI and other emergent digital technologies to make the world a better place—instead of one beset by myriad harms—is a natural ally and supporter of a more robust and capable Trust & Safety field. They just may not know it yet."
Project Liberty in the news
// Last week we hosted the inaugural Project Liberty Institute Summit: Toward a New Civic Digital Infrastructure, with both the Berkman Klein Center for Internet & Society at Harvard University and the MIT Center for Constructive Communication in Cambridge, MA. The event brought together an expansive network of technologists, policymakers, academics, civil society leaders, entrepreneurs, and governance experts for engaging and productive discussions. More to come in the following weeks!
// Project Liberty’s Amplica Labs announced the acquisition of Speakeasy's pioneering AI platform for improving digital discourse. This acquisition marks a significant step forward in addressing the pressing issues plaguing online conversations today. Read more here.
Other notable headlines
// 🚢 An article in The Verge explored the invisible seafaring industry that keeps the internet afloat by tending to the fiber optic cables along the sea floor.
// 🌳 “We need to rewild the internet,” according to an article in Noema Magazine. The internet has become an extractive and fragile monoculture. But we can revitalize it using lessons from ecologists.
// 🚨 An article in MIT Technology Review explored how AI was supposed to make police bodycams better, but hasn't delivered on that promise.
// 🎒 According to an article in the Wall Street Journal, students’ phone use is disruptive, but teachers and administrators are facing an unlikely opponent: parents.
// 🧠 The US took its first big step toward protecting your brain’s privacy. An article in Vox highlighted how Colorado passed legislation to prevent companies from selling your brainwaves.
// 🕵 A guide in The Markup provided insights into spotting audio and video deepfakes from a professor who’s studied them for two decades.
// 🗳 As two billion people in 50 countries head to the polls this year, an article in Rest of World tracked the most noteworthy incidents of AI-generated election content.
// Webinar on early childhood mental health and digital media
May 1st at 12pm ET
Children and Screens will host a webinar to explore how digital media impacts young children’s emotional, sensory, and relationship development. Register here.
// In-person event on suing social media platforms