Setting intentions for our relationship to technology in 2025

January 14th, 2025 // Did someone forward you this newsletter? Sign up to receive your own copy here.

Image by Project Liberty

How technology shapes our perception

 

This is the time of the year when many of us set new intentions about our relationship with our technology. Less time on social media. No phones in the bedroom. Fewer hours scrolling mindlessly. More books.

 

Around this time last year, we published a newsletter on our relationship with technology. It’s worth rereading if you’re looking for practical guidance on setting better boundaries with your tech in the new year, and for a reminder of why it’s so hard to reclaim your attention from devices and platforms designed to keep you locked in.

 

This year, we are exploring something related but different: how our technology shapes our perceptions of ourselves and society.

 

Our connection to technology is more than how many times a day we pick up our phones (144, on average), which social media platforms we frequent, or whether we are a “lurker” or a “poster.” Our technology can become the mirror in which we see ourselves and the world around us.

 

But how accurate or warped are those mirrors?

 

The stakes are high: These digital reflections shape not just our self-image, but how we understand and connect with others—in ways that ripple outward into our collective, social reality. With billions of us viewing the world through technological lenses each day, they mold how connected or divided we feel as a society and how strong or weak we perceive our democratic institutions to be.

 

// This is your brain on algorithms

The adage that “you are the average of the five people you spend the most time with” might need reconsideration in the digital age, particularly as Americans spend more time alone.

 

Today, we are more likely to be the average of the five platforms we spend the most time on or the five algorithms that inform our beliefs.

  • A study last month found that Gen Z thinks they need $500,000 a year to succeed (far more than the amount other generations believe is necessary). One hypothesis for this intergenerational disparity is the influence of social media, which can portray an aspirational life of luxury and encourage young people to benchmark their idea of success against an unattainable ideal found online. This dynamic is not unique to social media (television has done it for years), but the level of algorithm-driven personalization and monetization is more powerful than ever.
  • A study from 2021 found that the biggest predictor of a social media post going viral was whether it was about an out-group (i.e., the people who don’t belong to the in-group of those viewing the post). Posts about the opposing party were twice as likely to go viral as posts about one’s own party, and the majority of those posts were negative. In short, according to the researchers, “out-group hate was a stronger driver of virality than in-group love.” This engagement-driven virality amplifies perceptions of our own righteousness while reinforcing negative views of those with whom we disagree.

 

//

“If ‘you are the media,’ then there is no longer a consensus reality informed by what audiences see and hear: Everyone chooses their own adventure.”

//

 

// Filter bubbles

It’s no secret that we live in filter bubbles. The concept was coined over a decade ago by Eli Pariser, who wrote a book by the same name and is now a co-leader of Project Liberty Alliance partner New_ Public. It refers to the personalized ecosystem of information that’s been catered and curated for each of us by algorithms.

 

The content we are fed and the filter bubbles we inhabit are upstream of our perceptions of ourselves and of the world around us. And today, those filter bubbles vary based on who we are.


For example, during the 2024 U.S. Presidential election, an analysis by The Washington Post found that on TikTok, women across all political persuasions saw Harris campaign videos 40% more often than men. Meanwhile, men saw Trump campaign videos more than 2x as often as women did.

 

// The internet: no one knows what's going on 

When our algorithms curate a personalized experience just for us, there are fewer shared, collective experiences. An article by Charlie Warzel in The Atlantic in 2023 explored the idea that no one knows what’s happening online anymore because algorithms have segmented the internet into a million different internets. 

 

People can spend hours on TikTok every day, but completely miss the most viewed TikTok videos. As Warzel wrote, “The very idea of popularity is up for debate: Is that trend really viral? Did everyone see that post, or is it just my little corner of the internet?”

 

Warzel returned to the internet’s fragmentation in The Atlantic in November. This time, it was in the context of the U.S. Presidential election, which has been called “the podcast election” because candidates chose to bypass the mainstream media in favor of independent media firmly situated within the filter bubbles of their target constituents.

 

The day after the election, Elon Musk posted on X, “You are the media now.”

 

As a platform, X has a vested interest in encouraging its users to consider themselves the media and to post frequently. But Musk may be tapping into a deeper truth about how the modern internet works: the shared, collective experiences and truth-anchors of the past are being replaced with personalized, niche experiences driven by preferences, virality, and algorithms.

 

Warzel extrapolated the idea further: “If ‘you are the media,’ then there is no longer a consensus reality informed by what audiences see and hear: Everyone chooses their own adventure.”

 

// A new form of digital literacy in the new year

This new choose-your-own-adventure reality might give us too much credit for doing the choosing; often, it is the algorithm that chooses for us. But it does invite us to reconsider our own agency.

 

Just as we might exercise our agency at the start of each new year to build a different relationship with our technology and devices, this year we might also exercise it in ways that make us more humble and more skeptical about how algorithms shape our worldview and our sense of self.

 

This is the heart of true digital media literacy. Beyond just being able to distinguish fact from fiction online, a fragmented, algorithm-driven internet should invite us into a posture of humility: perhaps I’m not seeing everything here. Perhaps, in this new year, I can expand the influences that shape me beyond the algorithms and devices that think they know me.

 

// Four actions you can take

Here are four actions you can take to kick off 2025.

  1. Keep an Algorithm Diary: For a short period, track the content that platforms show you and how it impacts your emotions. Note patterns and identify content designed to provoke strong reactions to build self-awareness of algorithmic influence.
  2. Practice Bubble Hopping: Once a month, seek out thoughtful perspectives outside your usual views. Follow respected thinkers across the spectrum and aim to understand, not just critique, differing opinions.
  3. Pause for a Reality Check: Before reacting to or sharing content, pause and ask, “Does this reflect reality?” and “What outcome am I hoping for by posting?” Verify with sources, check coverage across outlets, and make sure you’re seeing the full picture. Check out ReThink, a technology that stops online hate before the damage is done.
  4. Curate Your Algorithms and Attention: Explore how you can shape the algorithms themselves by changing privacy settings, customizing your feed (on X, you can switch from a feed that optimizes for engagement to one that simply shows posts in chronological order), or moving to a new platform entirely that offers you more control. Finally, take control of your information diet by creating lists of trusted sources and dedicating more time to deeper, algorithm-free reading.

The People's Bid in the news

// Project Liberty Founder Frank McCourt Jr. was featured in a video segment on Bloomberg. Watch here.

Other notable headlines

// 👨‍🚒 Watch Duty’s wildfire tracking app has become a crucial lifeline for LA. "We view what we are doing as a public service," the cofounder of the nonprofit said, according to an article in The Verge.

 

// 🛡 The US still has no federal privacy law. But recent enforcement actions against data brokers may offer some new protections for Americans’ personal information, according to an article in MIT Technology Review.

 

// 🏫 Sensitive data belonging to students and teachers across several K-12 school districts appears to have been stolen in a recent breach of a major education technology provider, according to an article in Axios.

 

// 🕵️‍♀️ Global fact-checkers were disappointed, but not surprised, when Meta ended its third-party fact-checking program, according to an article in Rest of World. Groups in Pakistan, Argentina, and Brazil have been diversifying their revenue streams.

 

// 🏛 The fate of TikTok now rests in the hands of the US Supreme Court. If the law banning the social video app this month is upheld, TikTok won’t disappear from your phone, but things will get messy fast, according to an article in WIRED.


// 🤔 What happens when sprawling online communities fracture into politically homogeneous, self-governing communities? An article in Noema Magazine explored the great decentralization.

Partner news & opportunities

// Webinar on AI & the future of work

Today, January 14 at 12pm ET

Kick off the year with an insightful webinar featuring Erik Brynjolfsson and Tom Mitchell, the two co-chairs of the National Academies report on Artificial Intelligence and the Future of Work. Explore AI’s transformative potential, from productivity gains to workforce challenges, and join a live Q&A to engage with the experts. Register here.

 

// Introducing the Bridging Dictionary: A Tool for Reducing Polarization

Our partners at MIT’s Center for Constructive Communication (CCC) have launched the Bridging Dictionary, an innovative prototype that uses AI to explore how language differs across political divides. By analyzing thousands of transcripts and articles from Fox News and MSNBC, the tool identifies polarized words and suggests less divisive alternatives to foster better understanding. Read more about it here.

Join The People's Bid

Last week, the Supreme Court heard oral arguments on the potential TikTok ban. Catch up on everything you need to know in this USA Today primer that mentions The People's Bid. Project Liberty continues its work with TikTok creators like Sean Szolek-VanValkenburgh to promote The People's Bid. Like and repost Sean's video, and join the bid if you haven't yet!

What did you think of today's newsletter?

We'd love to hear your feedback and ideas. Reply to this email.

/ Project Liberty builds solutions that help people take back control of their lives in the digital age by reclaiming a voice, choice, and stake in a better internet.

 

Thank you for reading.

Facebook
LinkedIn
Instagram

10 Hudson Yards, Fl 40,
New York, New York, 10001
Unsubscribe  Manage Preferences

© 2024 Project Liberty LLC