
Filtered Realities: Examining the Invisible Influence

Photo courtesy of Google DeepMind
Abstract

Social media platforms have increasingly become a primary source of news for many people, yet much of the content displayed is curated by algorithms based on individual preferences. This reinforcement creates a high potential for the formation of filter bubbles and echo chambers, potentially hindering the understanding of complex topics or events. This study explores how exposure to algorithmically curated content on social media (news, politics, health, entertainment) can impact a person’s accuracy and understanding of said topics. Through content analysis of user interactions, surveys, and interviews, we can gain insight into this complex field. We will examine the methods behind social media algorithms, their interplay with selective exposure, and the dynamic system they create. While social media offers diverse viewpoints and perspectives from all over the world, it can also limit these same factors and prioritize viral content over in-depth analysis. This paper will deepen the understanding of how algorithms curate content, how companies and content creators keep up in a shifting environment, and how all of this affects the end user.

 

Introduction

This study aims to further investigate the impact of social media algorithms on the media individuals receive and, in turn, on their understanding of current events, focusing primarily on topics such as politics, health, science, and entertainment. The relevance of this study becomes increasingly prominent as social media becomes a primary source of news and information for a large percentage of the population. The lack of oversight on most social media and other information platforms raises concerns about the potential influence of algorithmic filters on public perception and understanding of important issues (Kitchens et al., 2020).

 

The new knowledge gained through this study will contribute to a deeper understanding of the role algorithms play in shaping our lives, both online and off. Gaining that understanding is important because the impact algorithms produce can have significant implications for decision-making, media literacy, and overall mental well-being. By shedding more light on these dynamics, this study hopes to promote greater media literacy and broader awareness of the different forces influencing our understanding.

Photo courtesy of Google DeepMind
Literature Review

Understanding Algorithms 

 

The sudden rise of social media has affected many facets of our lives and has also given rise to the algorithm, a word many people know but few fully understand. This rapid rise has completely reshaped how people consume news and media, with many users citing social media as a primary source. Traditional media outlets such as newspapers and television have seen their audiences decline drastically toward a flatline, and experts in the field do not foresee growth resuming. Instead, social media algorithmic feeds provide a non-stop stream of personalized information, curating content that aligns with a user’s individual preferences. These algorithms were originally designed to increase user activity; however, Bloomberg found that sharing of personal stories on Facebook dropped 21% year over year (Frier, 2016). Understanding how these algorithms function is crucial for navigating the current social media landscape and for comprehending the potential impact they have on our lives.

 

While the specifics of social media algorithms are often shrouded in secrecy, as platforms are tight-lipped about the details of their curation, some common key features have been identified through research and various reports (Berman, 2020). One of the main factors in determining content is engagement metrics: content that is generating, or is likely to generate, engagement (likes, comments, shares) is prioritized, with comments being the most valuable signal on many platforms. Beyond basic engagement, algorithms also take into account a user’s demographics (age, gender), time since posting (favoring newer posts), whether the post was made by a friend or just a followed account, posts hovered over, watch time, and even what people nearby have been liking (based on approximate location). Many popular apps also include features where users can actively help curate their own feeds by hiding posts, blocking users, or selecting “not interested.” This in turn affects content creators, who must now curate their own work to better fit the algorithm in order to maximize success or, in some cases, simply remain relevant (Narayanan, 2023). Howard Shimmel, a recent speaker at Stony Brook University, explained that streaming currently has the highest viewer share on televisions, with YouTube being the most popular platform globally. With more people getting information from independent sources, it becomes increasingly relevant to understand the different factors that go into the creation and display of modern media. To summarize, these algorithms, while each unique, serve the purpose of identifying the best content to show users by adapting to and learning from many aspects of user interaction.
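To make the interplay of these signals more concrete, the sketch below combines a few of the factors described above (engagement, recency, friend status, predicted watch time) into a single toy ranking score. It is a minimal illustration only; the factor names, weights, and formula are illustrative assumptions, not any platform’s actual, proprietary model.

```python
# A minimal, hypothetical sketch of an engagement-based feed ranking.
# All weights and factor names are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    shares: int
    hours_old: float
    from_friend: bool            # friend vs. merely-followed account
    predicted_watch_secs: float  # personalization signal

def rank_score(post: Post) -> float:
    # Comments weighted above shares and likes, reflecting reports that
    # they signal deeper engagement on many platforms.
    engagement = post.likes + 2 * post.shares + 3 * post.comments
    recency = math.exp(-post.hours_old / 24)          # newer posts favored
    relationship = 1.5 if post.from_friend else 1.0   # boost for friends' posts
    personalization = 1 + post.predicted_watch_secs / 60
    return engagement * recency * relationship * personalization

candidates = [
    Post(likes=120, comments=4, shares=10, hours_old=2, from_friend=False, predicted_watch_secs=15),
    Post(likes=30, comments=25, shares=5, hours_old=1, from_friend=True, predicted_watch_secs=40),
]
feed = sorted(candidates, key=rank_score, reverse=True)  # highest score shown first
```

Even in this toy form, the key point is visible: a post does not need the most likes to rank first; a newer post from a friend that the model expects the user to watch longer can outrank a more broadly popular one.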

 

With this understood, social media platforms have immense power in shaping our lives both online and off. While surfacing the posts most likely to grab our attention keeps us scrolling, it can also easily lead to filter bubbles, “a state in which social media consumption exhibits increased polarization and segregation of consumed content and in which ideas and concepts, limited in their diversity, match the consumer’s beliefs” (Berman, 2020). Additionally, algorithms that prioritize engagement can amplify or misrepresent content, with studies finding that when algorithms selectively amplify more extreme political views, people begin to think that their political in-group and out-group are more sharply divided than they really are (Brady, 2023). These potential downsides show the importance of researching how social media algorithms influence our perception of the information we consume.

RQ1: What are social media algorithms and filter bubbles?

 

Understanding the Impact of Algorithms and Filter Bubbles on Perception of Media

 

While filter bubbles have been theorized to distort user perceptions online, their specific impact on the perception of current affairs such as politics, health, and science remains largely unexplored. Research has shown that human attentional biases in favor of emotional content occur online even more frequently than they do offline, and since social media algorithms are designed for engagement, they end up amplifying this content even more (McLoughlin, 2024). By exploring how filter bubbles and the algorithm interplay with each other, we can better understand why and how they form and how that contributes to our perception of the world around us. With political polarization, health misinformation, and skepticism toward science becoming an unfortunate modern reality, the role filter bubbles play in spreading this information should not be underestimated.

 

Despite giving users access to a diversity of information, social media has been linked to increased societal polarization (Flaxman, 2020). By automatically recommending content users are likely to agree with, platforms often skew users toward more extreme beliefs, even if those beliefs do not match reality. As this filtering continues to improve, the effort required to actually choose what we would like to see will continue to decrease. This personalization is changing how we receive news while also changing the economics behind which stories get exposure (Pariser, 2011). Traditional means of learning from opinion leaders often relied on expertise and trust; however, the rise of social media has started to reshape this dynamic. As previously mentioned, content creators will often curate their own content to better fit the algorithm, and news agencies will tweak their titles to be more extreme or engaging. Recent speaker S. Mitra Kalita recounts this personally: while working for CNN, her job was tweaking news titles and choosing which stories would create the most appeal, shares, and recommendations. This interplay of changing how content is presented to better fit economic metrics is not new, but we have never seen it at this scope and scale before. Algorithms not only select and broadcast the voices of these opinion leaders but also actively incentivize them to fit within known boundaries, calling into question whether credibility and accuracy are sacrificed for sensationalism or confirmation bias within their audience (Mahadevan, 2020). When agencies are incentivized and rewarded for engagement rather than accuracy, the quality of the information the public receives inevitably suffers.
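The feedback loop described above, in which recommendations favor agreeable content, engagement favors more extreme content, and exposure gradually shifts what a user agrees with, can be illustrated with a small simulation. The sketch below is purely hypothetical: every number and weighting is an illustrative assumption, not an empirical parameter from the studies cited here.

```python
# A hypothetical toy simulation of the filter-bubble feedback loop: the
# recommender favors content close to the user's current stance but gives
# an extra boost to more extreme content (which tends to draw engagement),
# and repeated exposure slowly pulls the user's stance outward.

# Opinion spectrum from -1.0 (one extreme) to +1.0 (the other extreme).
items = [i / 50 - 1.0 for i in range(101)]

stance = 0.1        # the user starts slightly off-center
drift_rate = 0.05   # how strongly each exposure shifts the stance

def score(item: float, stance: float) -> float:
    similarity = 1.0 - abs(item - stance)   # "content you are likely to agree with"
    engagement = 1.0 + 2.0 * abs(item)      # extremity attracts engagement
    return similarity * engagement

for step in range(200):
    shown = max(items, key=lambda x: score(x, stance))  # top-ranked recommendation
    stance += drift_rate * (shown - stance)             # exposure nudges the stance

print(f"Stance after 200 recommendations: {stance:+.2f}")
# Starting from +0.10, the stance drifts toward roughly +0.5: noticeably more
# extreme than where the user began, without the user ever choosing to move.
```

The point of the toy model is not the specific numbers but the mechanism: when the ranking rewards both agreement and extremity, a user who never actively seeks out extreme content can still end up consuming, and adopting, more of it.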

These filter bubbles and their interplay with social media algorithms, content creators, and mainstream media create challenges to our understanding of current affairs. While we now have access to practically limitless information and opinions, the role the algorithm plays in what information we receive can often lead to distorted perspectives and amplified polarization among users. We currently exist within a system that rewards engagement over accuracy, driving opinion leaders to also prioritize engagement over accuracy. What was once a space for human interaction and sharing has transformed into a sophisticated machine controlled by consumption-driven algorithms (Walter, 2024). This problem is further complicated by the rapid rise of AI-generated content and the fact that many users already have difficulty discerning real from fake content. As a result, misinformation that engages “moral” emotions gains excessive traction online because human attention biases are further amplified by algorithmic curation, damaging public cohesion and even potentially affecting political outcomes (McLoughlin, 2024). Further research is crucial to understanding the full processes at play and how they influence how we interpret the information we receive.

 

RQ2: What role do filter bubbles play in influencing perception of current affairs (politics, health, science)?

Photo courtesy of Google DeepMind
The Hidden Influence

The project design is a short documentary featuring voiceover narration along with a combination of original footage and videos from other sources.

The primary aim of this documentary is to broaden viewers’ understanding of social media algorithms and their influence on daily life. By taking complex topics and condensing them into more digestible pieces of information, the documentary aims to make the content accessible and engaging for the audience.

 

Community Outreach

Project Look Sharp is a non-profit organization based at Ithaca College. Founded in 1996, its mission is to provide kindergarten-through-college educators with the resources to teach students how to engage effectively with modern media, enhancing students’ critical thinking, media literacy, and metacognition through professional development. The organization encourages educators and students not only to analyze the media messages they encounter but also to reflect on why they interpret media the way they do and why others may understand the same information differently. It offers many resources and lessons, available for download on the project’s website and on YouTube.

This project exploring filter bubbles could greatly benefit Project Look Sharp, as it aligns with their mission to increase media literacy among both students and staff. The documentary is educational and explores the impact of algorithms, filter bubbles, and biases, and how they play a role in everyday life. It could serve as a valuable resource in Project Look Sharp’s collection of development programs, giving educators another tool to address filter bubbles and spark meaningful conversations. Helping to ensure that young people are educated about and aware of filter bubbles is a crucial step toward navigating the digital world effectively.

Conclusion

This project set out to better understand the influence of filter bubbles on the information we receive and, in turn, how that affects our understanding of current events. Filter bubbles and algorithms can be opaque and difficult to avoid, and the information surrounding them is scarce and not well understood. Research on this phenomenon is ongoing, with much of the literature finding mixed results. Despite this, thorough analysis found that most young people are aware of the filter bubble phenomenon and receive enough outside information that they are not forced into a filter bubble. Going forward, ensuring accountability for major tech companies can help promote transparency, mitigate filter bubble risks, and combat the spread of misinformation.

 
