facebook’s algorithm removes politically diverse content from your feed

Today, three researchers at Facebook released a new study in Science titled “Exposure to ideologically diverse news and opinion on Facebook.” The authors summarize their own findings in a companion blog post:

We found that people have friends who claim an opposing political ideology, and that the content in peoples’ News Feeds reflect those diverse views. While News Feed surfaces content that is slightly more aligned with an individual’s own ideology (based on that person’s actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter.

As several commentators have noted, this framing is a little weird.

Christian Sandvig argues that the results are carefully written up to exculpate Facebook from exactly the charge that the research supports: that Facebook’s algorithm polarizes the news users see. That users preferentially click on ideologically aligned stories is beside the point. The study shows that Facebook’s algorithm “removes hard news from diverse sources that you are less likely to agree with but it does not remove the hard news that you are likely to agree with.”

Zeynep Tufekci and Eszter Hargittai offer related criticisms, focusing on significant methodological issues that are buried in the Science piece. Tufekci, for example, argues that the most important finding is actually buried in an appendix: a confirmation that placement in the News Feed has a huge effect on the likelihood that users will click on a story, meaning that Facebook has at its disposal the power to drive clicks toward or away from different stories. This finding isn’t unprecedented, but it’s important to see it confirmed by Facebook’s own in-house researchers.

All three pieces note that the sample used in the study consists of just two thirds of half of 4% of users – those who use Facebook more than 4 days per week, and whose ideology could be coded from the ideological affiliation they self-report in their Facebook profiles. Given the limitations of the sample, Hargittai argues that the authors’ claim to “conclusively establish that on average in the context of Facebook, individual choices (2, 13, 15, 17) more than algorithms (3, 9) limit exposure to attitude-challenging content” is unsupportable.
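Taken literally, “two thirds of half of 4%” is a very small slice of the user base. A quick, purely illustrative calculation (the fractions come straight from the critiques above; how each maps onto a specific filter is my reading, not the paper’s):

    # "Two thirds of half of 4%" of users, taken at face value.
    sample_share = (2 / 3) * 0.5 * 0.04
    print(f"{sample_share:.2%} of users")   # roughly 1.3%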

But again, as Sandvig notes, this comparison is weird in the first place:

The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking. Probably while they are in the coal mine. What I mean to say is that there is no scenario in which “user choices” vs. “the algorithm” can be traded off, because they happen together (Fig. 3 [top]). Users select from what the algorithm already filtered for them. It is a sequence.

Read the study, read the commentaries, and let us know what you think in the comments!

EDIT: Nathan Jurgenson has useful reflections as well, focusing on the impossibility of a neutral filtering algorithm:

Facebook orders and ranks news information, which is doing the work of journalism, but they refuse to acknowledge they are doing the work of journalism. Facebook cannot take its own role in news seriously, and they cannot take journalism itself seriously, if they are unwilling to admit the degree to which they shape how news appears on the site. The most dangerous journalism is journalism that doesn’t see itself as such.

To my mind, the “algorithm” vs. “users’ choice” framing of the study resembles debates about the role of genetics vs. the environment, or nature vs. nurture. As in those debates, the simple version of the framing is highly misleading. As Sandvig puts it: “algorithm and user form a coupled system of at least two feedback loops.”
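To make the “coupled system” point concrete, here is a toy simulation of the loop the critics describe: a ranker that favors stories matching a user’s inferred leanings, a user who clicks what is placed near the top and what they already agree with, and clicks that feed back into the next day’s ranking. Nothing here is Facebook’s actual model; every name, number, and functional form is invented for illustration.

    import random

    # Toy model of the News Feed loop described above. Purely illustrative;
    # this is not Facebook's algorithm.
    random.seed(0)

    def make_stories(n=20):
        """Each story gets an ideological alignment in [-1, 1] relative to the user."""
        return [{"id": i, "alignment": random.uniform(-1, 1)} for i in range(n)]

    def rank(stories, affinity):
        """Feedback loop 1: the ranker favors stories that match the user's
        inferred affinity, which it learned from past clicks."""
        return sorted(stories, key=lambda s: -(s["alignment"] * affinity))

    def click_probability(position, alignment, affinity):
        """Feedback loop 2: users click what sits near the top (placement effect)
        and what they already agree with (selective exposure)."""
        placement_effect = 1.0 / (1 + position)
        agreement_effect = 0.5 + 0.5 * alignment * affinity
        return placement_effect * agreement_effect

    affinity = 0.2   # the user starts out only mildly partisan
    for day in range(30):
        feed = rank(make_stories(), affinity)
        for position, story in enumerate(feed):
            if random.random() < click_probability(position, story["alignment"], affinity):
                # Clicks nudge the inferred affinity toward the clicked story,
                # which changes tomorrow's ranking: the two "causes" form a
                # sequence, not independent factors that can be traded off.
                affinity += 0.05 * (story["alignment"] - affinity)

    print(f"inferred affinity after 30 days: {affinity:.2f}")

Run this way, the inferred affinity drifts toward whatever the user clicks, and what the user clicks comes from what the ranker already placed on top, which is exactly why “user choices” vs. “the algorithm” are hard to treat as separable quantities.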

Author: Dan Hirschman

I am a sociologist interested in the use of numbers in organizations, markets, and policy. For more info, see here.

4 thoughts on “facebook’s algorithm removes politically diverse content from your feed”

  1. Information asymmetries are a pernicious source of market failure, and I agree with the spirit (if not the letter) of these criticisms. But I think Facebook “taking journalism seriously” would be a mistake. New media are beautiful precisely because they’re not as “serious,” and not as subject to insular standard policing as traditional media have been.

    Would we have gotten this essay from an NYPD cop through traditional media? No, and you can see the quality of Buzzfeed’s content improving quickly. That’s what competition does, and what professions “taking things seriously” and freaking out about technological progress in their industries because of failed Marxian theories doesn’t do.

    http://www.buzzfeed.com/dreamworks/why-cops-like-me-stay-quiet-about-police-brutality

    As for polarization, I think the whole thing is a measurement artifact. People have always had opinions but did not voice them. Now information is cheaper, so people exercise voice more often, think more critically, and have stronger opinions.

    I’d rather have a polarized world where people debate and think for themselves than the narrow spread of politics and opinion offered by five major networks. Is the success of radical feminism online an example of the bad effects of polarization? No. People dislike polarization only when it’s the other team that’s allegedly becoming more dogmatic and fanatical.


  2. Dan, you are right that the algorithm plays a role. It’s not surprising that a lot of people would worry about this, since we aren’t fully sure how the algorithm works! But Facebook’s algorithms can’t remove the things that my friends didn’t share in the first place.

    I think Nathan Jurgenson’s critique that you quoted is interesting too: “The most dangerous journalism is journalism that doesn’t see itself as such.” But the better question is whether we can expect social networking platforms to function as journalism, given that they are designed to let users friend or follow only the people they want to follow. Probably not. Skimming Facebook (and not following the links) may be a good way for people to feel informed without actually going through the trouble of seeking out new information.

    (I said more in my own post, which is in the trackbacks.)

