When you scroll through your Facebook news feed, you most likely see a slew of funny videos, sponsored posts and news articles. Facebook’s mission statement, posted on its investor relations page, states its goal is to “give people the power to build community and bring the world closer together.” Yet, the endless stream of content from businesses and media outlets has made passive scrolling much more common than active engagement.

To address this issue, Facebook founder and CEO Mark Zuckerberg announced Jan. 11 that Facebook is shifting its focus back to its roots — connecting with friends, family and groups. One key part of this change will be to alter the algorithm behind its news feed so content created by a user’s immediate community is more prominent than posts from businesses, mainstream media outlets and other major pages.

At first glance, this change seems positive. As entertaining as binge-watching BuzzFeed videos may be, commenting on posts written by friends seems more fulfilling. As Zuckerberg wrote in his Facebook announcement, research indicates connecting with family and friends improves well-being and happiness more than consuming random news.

However, this algorithmic change does not fix the broader issues social media has exacerbated: echo chambers and an increasingly polarized society.

A 2017 study by the Pew Research Center found that 67 percent of Americans get at least some of their news from social media.

Much of this news is biased toward the consumer’s existing viewpoint. To boost metrics like engagement and time spent on the platform, tech giants like Google, Facebook and Amazon have engineered algorithms that surface content tailored to users’ interests. This targeting keeps users happy, since they see what they already like. But the strategy is also dangerous: it narrows the range of viewpoints to which people are exposed.

Take two Facebook users: one liberal and one conservative. The liberal user’s news feed is likely full of articles from HuffPost, The New York Times, Politico, CNN and Vox, while the conservative user’s news feed probably displays articles from Fox News, National Review and Breitbart. If the two users lack shared standards of evidence or an understanding of different views, civil debate is nearly impossible.

The Wall Street Journal’s Blue Feed, Red Feed project, which displays a sample “blue” feed alongside a sample “red” feed, illustrates the disparity in political news on Facebook. The Google Chrome extension PolitEcho lets users visualize the political biases in their own news feeds. As a college student who hails from California, I was not surprised to learn my feed is drowned in blue.

Rather than seeking out diverse viewpoints, social media users tend to stick with — or even cling to — communities that look or think like them. When these communities blindly reinforce each other’s beliefs, they create echo chambers.

Echo chambers are dangerous because people within them derive truths about the world from mutually reinforcing sources, which may or may not be based in reality. This social condition explains why people believe ludicrous stories like the Pizzagate conspiracy theory and bizarre jokes like the Gorilla Channel, and why fake news spread so rapidly during the 2016 election.

So, how will Facebook’s renewed focus on personal connection affect these pressing challenges to democracy? Users will certainly be exposed to fewer news articles — real and fake. Beyond that, these changes may do little to combat our increasing fragmentation. In fact, if one’s friends and family share similar beliefs, the echoes may even get louder throughout one’s network.

Because so many people get their news from social media, Facebook must work to make its users not only “happier,” but also more accurately informed. Facebook users should not only interact with friends and family, but also engage in political discourse with people from diverse backgrounds and different worldviews.

Within Facebook, changes are being made, slowly but surely. Product teams are rolling out a feature called Related Articles, which suggests additional articles that offer different perspectives on the stories displayed in users’ feeds.

“We have a whole team working on this problem and trying to create better incentives for civil, constructive conversations,” Samidh Chakrabarti, product manager of civic engagement at Facebook, wrote in a Jan. 22 blog post.

In the meantime, perhaps the onus is on us as individuals to check our cognitive biases, consume news from different sources across the political divide and seek connections with people who challenge our perspectives. After all, ignorance is bliss, but knowledge is power.

Sabrina Ma is a senior in the College.

