How Algorithms Replaced Human Editors: Yuval Harari’s Warning
In his recent book Nexus, Yuval Noah Harari suggests that we misunderstand how culture is shaped—not just online today but throughout history.
In his telling, we tend to give the creators of culture—the authors, musicians, philosophers, artists, and contemporary influencers and content creators—too much credit.
Yes, creators play an essential role, Harari concedes. But it is the editors who shape culture.
They are the ones who decide what gets included and excluded, what gets published and what gets left on the cutting room floor. They shape what has the chance to enter the zeitgeist and what falls away into obscurity.
For centuries, the editorial role has rested in human hands—shaped by editors’ power and, at times, influenced by their biases and preferences.
But today, as Harari points out, much of that editorial power has shifted to technology. On Instagram, X, or TikTok, it’s no longer human editors making the calls. Instead, a complex web of algorithms—often trained on opaque criteria—determines what gets amplified and what fades into the void.
We are downstream of our algorithms. They don’t so much reflect culture as shape the culture that shapes us.
If all this is true, how might we move upstream to take greater agency over the information we engage with? Is the solution simply consuming more selectively, or is it being more intentional in crafting our digital environment?
In this week’s newsletter, we explore the idea that our algorithms are editors that shape our experience and what we can do about it.
// Globalizing sameness
Algorithms don’t just direct a society’s attention; they also influence buying patterns and individual preferences. It can be hard to tease apart what we genuinely like from what algorithms have told us to like.
- Homogenous listening: After a year of listening to music and podcasts on Spotify, each user receives Spotify Wrapped, a personalized summary of their most listened-to songs, creators, and genres. But as Tiffany Ng wrote in MIT Technology Review last year, Spotify’s algorithm herds us into familiar listening patterns and makes it harder to discover new music.
- Algorithm-designed spaces: In the book Filterworld, author Kyle Chayka explains how social media algorithms “flatten” and homogenize our culture by making decisions for us. All of a sudden, we all like the same things and have the same taste.
// Editors throughout history
Editors have always played an important role in society. In the 4th century, the Catholic bishops who gathered in Carthage (modern-day Tunisia) for the Councils of Carthage decided which books should be included in the Bible.
Last month, for its 100th anniversary, The New Yorker ran an article about the role of the invisible editor in shaping the articles that ultimately made it onto its pages. At The New Yorker, the editors possess extraordinary power.
In the 20th century, the producers who oversaw the big three network news channels (NBC, ABC, and CBS) disproportionately influenced the cultural zeitgeist. Their coverage of the Vietnam War had a bearing on how divided and charged the nation became in the 1960s and 1970s.
What receives our attention has always been shaped by gatekeepers. The difference today? The gatekeepers are no longer human.
// Algorithms: The new editors
With algorithms as our modern-day editors, core issues emerge:
- Algorithms, like people, are biased. In previous editions of this newsletter, we've explored the risks of algorithmic biases in housing, healthcare, education, and facial recognition.
- Algorithms are upstream from our attention. Chayka’s book Filterworld explains how today’s creators—from interior designers to musicians—are more inclined to optimize for an algorithm’s approval than for the engagement of human audiences themselves.
- Algorithms are opaque. Many algorithms are not transparent about how they work or what types of content they reward and promote.
// Reclaiming choice
Our mission at Project Liberty is to build solutions that help people regain control of their digital lives by reclaiming a voice, choice, and stake in a better internet.
It’s hard to entirely extricate ourselves from the influence of algorithms, but the effort to reclaim control and exercise more choice is unfolding in multiple realms.
- Policy & regulation around algorithmic transparency: You can’t shape what you don’t understand, and today, many algorithms are black boxes controlled by tech companies unwilling to open the hood and show how their algorithms work. Audits of algorithms can help. The EU is a leader in regulation with its 2024 AI Act, which requires that “AI systems are developed and used in a way that allows appropriate traceability and explainability, while making humans aware that they communicate or interact with an AI system, as well as duly informing deployers of the capabilities and limitations of that AI system and affected persons about their rights.” In the U.S. in 2022, senators proposed an Algorithmic Accountability Act, but it hasn’t yet been passed into law.
- New platforms that give users more control: Platforms like Bluesky allow users to choose which algorithms they want to drive their experience on the platform. In a 2023 post on algorithmic choice, Bluesky CEO Jay Graber wrote, “We want a future where you control what you see on social media. We aim to replace the conventional ‘master algorithm,’ controlled by a single company, with an open and diverse ‘marketplace of algorithms.’”
- Consumers demanding change: In 2022, widespread protests from Instagram users about changes to the algorithm and design of Instagram led the platform to walk back the changes.
- Consumers finding workarounds to manipulate the algorithm: Some platforms allow users to select chronological feeds as opposed to ones oriented around engagement. On X, users can select different feeds. On YouTube, viewers can manage their recommendations by selecting videos they’re not interested in.
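The difference between these feed types—and the spirit of Bluesky’s “marketplace of algorithms”—can be sketched as interchangeable ranking functions that a user, rather than a platform, selects. This is a minimal illustration, not any platform’s actual implementation; the `Post` fields and the engagement weighting are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int  # seconds since epoch (illustrative)
    likes: int
    shares: int

def chronological(posts):
    """Newest first: no editorial judgment beyond recency."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked(posts):
    """Most-engaged first, using an arbitrary illustrative weighting."""
    return sorted(posts, key=lambda p: p.likes + 2 * p.shares, reverse=True)

posts = [
    Post("a", timestamp=100, likes=1, shares=0),
    Post("b", timestamp=50, likes=10, shares=5),
    Post("c", timestamp=75, likes=3, shares=1),
]

# The user, not the platform, picks the "editor":
feed = chronological
print([p.author for p in feed(posts)])               # ['a', 'c', 'b']
print([p.author for p in engagement_ranked(posts)])  # ['b', 'c', 'a']
```

The same posts produce two different feeds depending on which function ranks them—a compact picture of how much editorial power sits in the ranking step, and why letting users swap that step matters.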
In the book Nexus, Harari suggests that culture doesn’t just happen. It is engineered by those who control the flow of information. Today, that power rests in the hands of algorithms. But technology isn’t destiny. Just as humans once shaped the editorial gatekeeping of the past, we still have the ability to push for transparency, demand choice, and build alternatives to today’s algorithmic editors. It’s still up to us to be the ultimate editors.