why are headlines so bad at causality?

Every major media outlet has been reporting on the big JAMA paper by Chetty et al on income inequality and life expectancy. I haven’t read the paper yet, but I’ve been following the coverage. As is true of any complex social science finding, the details can be tricky to report. Ezra Klein at Vox.com does a pretty good job of explaining the article itself, how it fits into existing findings (for example, about the relatively small effect of access to health care on mortality) and how the researchers approached competing explanations.

The big story, for Klein and for others reporting on the piece, is the surprising importance of geography (which builds on a long tradition of work on the importance of cities and neighborhoods in both economics and sociology). Klein, summarizing at the end of his piece, writes: “A broader political lesson of the study is that whatever policies are driving the differences here, they’re happening at the state and local level, not just (or even mainly) at the federal level.”

Tackling the thorny issue of whether these geographic effects are causal, Klein explains nicely:

Sadly, the study isn’t able to prove anything even approaching causality for any of these explanations. It also can’t tell us if there’s potentially some unobservable — and thus uncontrollable — difference between the kinds of poor people who move to big cities in rich, blue states and the kinds of poor people who don’t. And if there’s any clear message embedded in the findings, it’s to be careful making sweeping pronouncements based on suggestive, but far from proven, observational data.

So far so good! How wonderful it would be if more coverage of articles like this included such nuance – engagement with the literature, reporting doubt, discussions of methodology, and especially skepticism of making strong causal claims based on complex models built from observational data. Unfortunately, none of that nuance made it into Vox’s clickbait-y headline:

[Screenshot of Vox’s headline]

Sigh. Why does this happen? I mean, I guess there are the obvious reasons that underlie all clickbait headlines. But is there something more specific about this genre of clickbait, the interesting-correlation-reported-as-causal? And not just causal, but translating into a narrow policy or personal prescription? What can we do about it?

Author: Dan Hirschman

I am a sociologist interested in the use of numbers in organizations, markets, and policy. For more info, see here.

2 thoughts on “why are headlines so bad at causality?”

  1. Agree. One problem is the contortions they go through to phrase headlines in the second person. Here’s one on that: https://familyinequality.wordpress.com/2012/05/01/hello-you-shall-we-walk-in-a-samples-shoes/.

    On the Chetty paper itself, really not sure about this. They have two datasets that don’t include race (IRS & SS), then do some sort of population-level control/projection for race, and end up with maps that look suspiciously like % Black maps — again. So “move to California” might not be the most pertinent advice.


  2. As you probably know, the job of writing headlines is not done by the authors of articles. But it seems to me that this problem of false causality in interpreting data results also shows up (usually in more muted form) in the descriptions of regression coefficients in published professional research.

