Thanks to the proliferation of social media platforms, information has become a readily available commodity in modern society. Yet, as Yuval Avnur, Professor of Philosophy at Scripps College, points out, it often seems as if this “age of information” has caused more problems than it has solved. In an interview with The Nobleman, Avnur said, “Your generation has access to more information than any human generation has ever had, and yet, people seem more polarized than ever before. This is puzzling because you would think that if everyone had access to more information, then there’d be fewer disagreements.” Why, then, do we see so much polarization in modern society if we have access to more information than ever before? According to Avnur, the root cause may lie in a phenomenon called the “online echo chamber.”
Many students at Nobles share Avnur’s concerns, but most are unfamiliar with the concept of echo chambers. Instead, some point to misinformation as a potential cause of these societal issues. “I feel like the majority of the younger generation receives their news on social media. This can definitely cause problems because it’s way easier to spread fake information on social media than on an actual news source,” Joza Wang (Class IV) said. Expressing a similar sentiment, Riya Jain (Class II) said, “I do think getting your news from social media can be problematic if it doesn’t show you the full story.”
Misinformation and misrepresentation are undoubtedly significant problems—especially given the recent rollback of fact-checking measures on platforms such as X. However, Avnur argues that echo chambers pose a far greater threat, as they affect not only the information we see online but also how we process information, both online and in the real world.
The crux of Avnur’s argument is that algorithmic recommendation systems and friend networks on social media expose users primarily to content from like-minded sources, thus reinforcing their preexisting beliefs. This constant reinforcement is problematic because it makes a user less responsive to reason or new evidence that contradicts their views. “Through the feed[…]one is being schooled on how to reach or maintain a belief or inclination in light of the current evidence, whatever the evidence is,” Avnur wrote in a 2020 paper published in the Journal of Applied Philosophy.
Avnur illustrates this idea by drawing a parallel to how cult members are indoctrinated into absurd ways of thinking through the reinforcement of certain narratives and isolation from any ideas that oppose them. After enough reinforcement, cult members become so entrenched in irrationality that they often require psychiatric “deprogramming” to rebuild a capacity for critical thinking. Cult indoctrination, in this view, is an extreme version of the echo chambering that occurs online.
According to Avnur, social media use makes people less rational in how they form their beliefs, which naturally prompts questions about how young, impressionable students should be engaging with these platforms. Are online echo chambers a problem at Nobles?
Notably, students engage with social media algorithms quite differently, which affects the degree to which they are exposed to echo chambers. Some engage more with certain kinds of content while ignoring others. “I usually just scroll past content that I disagree with, and I think that the algorithm kind of picks up on that and ends up showing me less of it,” Mack Smink (Class II) said. Others make a conscious effort not to let the algorithm dictate their exposure. “I try not to interact with any opinionated posts because I don’t want my algorithm to sway either way. I kind of want to see everything,” Kailynn Zheng (Class II) said.
As a result of these differing engagement patterns, some students reported receiving content mostly from like-minded sources—perhaps suggesting the presence of an echo chamber—while others reported a diversity of content on their feeds. “During the election, I would get so many videos from ‘Kamala HQ,’ or whatever, and then I would scroll five times and the next video would be from ‘Trump HQ.’ I feel like the algorithm wasn’t really representative of my views at all. Even now, on Instagram, I’ll see so many different news agencies like Fox and NBC,” Rayan Salamipour (Class II) said.
Interestingly, some students even end up with feeds dominated by views they disagree with. “If I were to name the top three [political influencers] I’ve seen on my feed, it would probably be Ben Shapiro, Charlie Kirk, and Candace Owens. They’re all super Republican. I think that both sides [of the political spectrum] are not equally represented because the liberal side of social media just doesn’t attract as much attention—they don’t really have the same intensity or charisma, or events as the Republican side of social media,” an anonymous Class III student said. Social media algorithms prioritize engagement, meaning that polarizing or high-energy content may have an outsized presence on a feed regardless of a user’s own opinions.
While echo chambers do not appear to be a universal problem at Nobles, it is still vital for students to remain aware of how social media can shape their beliefs. In a word of advice to Nobles students, Avnur said, “Try to remember that platforms like Instagram, Twitter, Facebook, and TikTok are not sources of information; they are entertainment. They are designed to keep your eyeballs on the screen for as many nanoseconds as possible. That’s the most important thing to remember when using these apps: social media is not for learning.”
The effects of echo chambers can be entirely subliminal, so students should push themselves to constantly question their own biases. “I’ve stopped taking any of my opinions for granted because it’s so obvious to me now that I don’t read into stuff as much as I should. I form stances pretty quickly, which is something that’s quite universal nowadays[…]I’m also pushing myself to try to hear people out more. Even if your stance seems crazy, I’m not going to count you out for believing that,” Ava Neal (N ’23), a graduate of Professor Avnur’s class, said.