How technology shapes our perception
This is the time of the year when many of us set new intentions about our relationship with our technology. Less time on social media. No phones in the bedroom. Fewer hours scrolling mindlessly. More books.
Around this time last year, we published a newsletter on our relationship with technology. It’s worth rereading if you’re looking for practical guidance on how to set better boundaries with your tech in the new year, and on why it’s so hard to reclaim your attention from devices and platforms designed to keep you locked in.
This year, we are exploring something related but different: how our technology shapes our perceptions of ourselves and society.
Our connection to technology is more than how many times a day we pick up our phone (144, to be exact), which social media platforms we frequent, or whether we are a “lurker” or a “poster.” Our technology can become the mirror in which we see ourselves and the world around us.
But how accurate or warped is that mirror?
The stakes are high: These digital reflections shape not just our self-image but how we understand and connect with others, in ways that ripple outward into our collective social reality. With billions of us viewing the world through technological lenses each day, those lenses mold how connected or divided we feel as a society and how strong or weak we perceive our democratic institutions to be.
// This is your brain on algorithms
The adage, "You are the average of the five people you spend the most time with" might need reconsideration in the digital age, particularly as Americans spend more time alone.
Today, we are more likely to be the average of the five platforms we spend the most time on or the five algorithms that inform our beliefs.
- A study last month found that Gen Z thinks they need $500,000 a year to be successful (far more than other generations believe is necessary). One hypothesis for this generational gap is the influence of social media, which can portray an aspirational life of luxury and encourage young people to benchmark their idea of success against an unattainable ideal found online. This dynamic is not unique to social media (television has done it for years), but the level of algorithm-driven personalization and monetization is more powerful than ever.
- A 2021 study found that the biggest predictor of a social media post going viral was whether it was about an out-group (i.e., the people who don’t belong to the in-group viewing the post). Posts about the opposing party were twice as likely to go viral as posts about one’s own party, and the majority of those posts were negative. In short, according to the researchers, “out-group hate was a stronger driver of virality than in-group love.” This engagement-driven virality amplifies perceptions of our own righteousness while reinforcing negative views of those with whom we disagree.
//
“If ‘you are the media,’ then there is no longer a consensus reality informed by what audiences see and hear: Everyone chooses their own adventure.”
//
// Filter bubbles
It’s no secret that we live in filter bubbles. The concept (and a book by the same name) was coined over a decade ago by Eli Pariser, now a co-leader of New_ Public, a Project Liberty Alliance partner. It refers to the personalized ecosystem of information that has been curated and tailored to us by algorithms.
The content we are fed and the filter bubbles we inhabit are upstream of our perceptions of ourselves and of the world around us. And today, those filter bubbles vary based on who we are.
For example, during the 2024 U.S. Presidential election, an analysis by The Washington Post found that on TikTok, women across all political persuasions saw Harris campaign videos 40% more often than men did. Meanwhile, men saw Trump campaign videos more than twice as often as women did.
// The internet: no one knows what's going on
When our algorithms curate a personalized experience just for us, there are fewer shared, collective experiences. A 2023 article by Charlie Warzel in The Atlantic explored the idea that no one knows what’s happening online anymore because algorithms have segmented the internet into a million different internets.
People can spend hours on TikTok every day, but completely miss the most viewed TikTok videos. As Warzel wrote, “The very idea of popularity is up for debate: Is that trend really viral? Did everyone see that post, or is it just my little corner of the internet?”
In November, Warzel returned to the topic of the internet’s fragmentation in The Atlantic, this time in the context of the U.S. Presidential election, which has been called “the podcast election” because candidates chose to bypass the mainstream media in favor of independent media firmly situated within the filter bubbles of their target constituents.
The day after the election, Elon Musk posted on X, “You are the media now.”
As a platform, X has a vested interest in encouraging its users to consider themselves the media and to post frequently. But Musk may be tapping into a deeper truth about how the modern internet works: the shared, collective experiences and truth-anchors of the past are being replaced with personalized, niche experiences driven by preferences, virality, and algorithms.
Warzel extrapolated the idea further: “If ‘you are the media,’ then there is no longer a consensus reality informed by what audiences see and hear: Everyone chooses their own adventure.”
// A new form of digital literacy in the new year
This new choose-your-own-adventure reality may give us too much credit for doing the choosing; it is often the algorithm that chooses for us. But it does invite us to reconsider our own agency.
Just as we might exercise that agency at the start of each new year to change our relationship with our technology and devices, this year we might also exercise it by becoming more humble about, and more skeptical of, how algorithms shape our worldview and our sense of self.
This is the heart of true digital media literacy. Beyond being able to separate fact from fiction online, a fragmented, algorithm-driven internet should invite us into a posture of humility: perhaps I’m not seeing everything here. Perhaps, in this new year, I can expand the influences that shape me beyond the algorithms and devices that think they know me.
// Four actions you can take
Here are four actions you can take to kick off 2025.
- Keep an Algorithm Diary: For a short period, track the content that platforms show you and how it affects your emotions. Note patterns and identify content designed to provoke strong reactions; the goal is to build self-awareness of algorithmic influence.
- Practice Bubble Hopping: Once a month, seek out thoughtful perspectives outside your usual views. Follow respected thinkers across the spectrum and aim to understand, not just critique, differing opinions.
- Pause for a Reality Check: Before reacting or sharing content, pause and ask, “Does this reflect reality?” “What outcome am I hoping for by posting?” Verify with sources, check coverage across outlets, and ensure you're seeing the full picture. Check out ReThink, a technology that stops online hate before the damage is done.
- Curate Your Algorithms and Attention: Explore how you can shape the algorithms themselves by changing privacy settings, customizing your feed (on X, you can switch from a feed that optimizes for engagement to one that simply shows tweets in chronological order), or moving entirely to a new platform that offers you more control. Finally, take control of your information diet by creating lists of trusted sources and dedicating more time to deeper, algorithm-free reading.