There’s a lot of social science triumphalism about the accuracy of Nate Silver’s predictions in the election. I’m certainly happy. But does sociology as a discipline deserve to gloat? From where I’m sitting, Nate Silver contradicts at least a couple of things many sociology methods teachers have been telling their students for a long time.
1. Silver’s projections were based on a meta-analysis of state polls that did not come close to what many sociologists have regarded as basic standards for publishable data. It’s not like Silver did anything (too) magical with his own analysis: reweighted state polls proved to be quite accurate. Sociologists have imagined that all kinds of inferential problems arise from nonresponse that cannot be resolved by reweighting. In this view, Silver’s analysis should have been garbage in, garbage out.
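To make concrete what averaging a pile of state polls can look like, here is a toy sketch of one standard approach, precision (inverse-variance) weighting, where bigger samples count for more. All the numbers are invented, and Silver’s actual model is far more elaborate (house effects, recency weights, economic fundamentals, and so on); this is just the bare idea.

```python
# Hypothetical state polls: (candidate's share, sample size).
# Numbers are made up for illustration only.
polls = [(0.52, 800), (0.50, 600), (0.53, 1000)]

def weighted_average(polls):
    """Precision-weighted (inverse-variance) average of poll shares."""
    num, den = 0.0, 0.0
    for p, n in polls:
        var = p * (1 - p) / n   # sampling variance of a proportion
        w = 1.0 / var           # larger, tighter polls get more weight
        num += w * p
        den += w
    return num / den

estimate = weighted_average(polls)
```

The point of the sketch is only that the aggregation step itself is not magic: it is a weighted mean, and the surprise is that the inputs (low-response-rate polls) were good enough for it to work.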
To see what I mean, let’s look at something one sociologist said this summer on this very issue (but in a very different political context):
“Half-ass data like that from Pew may be novel enough on some questions to justify publication in a third tier journal, but you can forget sending shit like that to top 10 journals if I have anything to say about it. And, even “high quality” BS data like Pew are completely unacceptable when the question makes inferences about population characteristics—Using Pew or Gallup to say how many people identify with a religious group, for example, or what proportion of the population supports same sex marriage. Who fucking cares what the Pew data show? Not me. And the Pew data are now considered excellent! Back when I was an undergraduate, Doug Eckberg would have failed me in Survey Research Methods if I only garnered a 20% response rate for my interviews…”
To be clear: estimating the percentage of people who are going to vote for Obama is precisely “making inferences about population characteristics.” Indeed, it’s a more difficult task than estimating the proportion of the population who supports gay marriage, because it isn’t just a question of estimating population sentiment, but of laying that estimate over a model of which people will actually turn out and vote. The position “Who fucking cares what the Pew data show?” because of low response rates implies the pre-election position “Who fucking cares what Nate Silver says?”
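To see why the turnout layer makes the problem harder, here is a minimal sketch: the same respondents yield different estimates depending on whether you count everyone equally or weight each person by their probability of actually voting. Everything here (the respondents and the turnout probabilities) is invented for illustration.

```python
# Invented respondents: stated preference plus an assumed
# probability of actually turning out to vote.
respondents = [
    {"supports_obama": True,  "p_vote": 0.9},
    {"supports_obama": False, "p_vote": 0.6},
    {"supports_obama": True,  "p_vote": 0.4},
    {"supports_obama": False, "p_vote": 0.8},
]

def raw_support(rs):
    """Share supporting Obama among all respondents (ignores turnout)."""
    return sum(r["supports_obama"] for r in rs) / len(rs)

def likely_voter_support(rs):
    """Turnout-weighted share: each respondent counts in proportion
    to the probability that they actually vote."""
    num = sum(r["p_vote"] for r in rs if r["supports_obama"])
    den = sum(r["p_vote"] for r in rs)
    return num / den
```

In this made-up sample the candidate polls at 50% of respondents but worse among likely voters, because his supporters are (by assumption) less likely to turn out. Estimating population sentiment and modeling turnout are two separate inferential problems, and an election forecast has to get both right.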
Granted, the idea that response rates ought to be important is both intuitive and plausible. And yet it still requires evidence. The surprise is how, despite all the increasing impediments to getting people to participate in surveys, brief-field-period-low-response-rate-with-reweighting polling still apparently works.
Sociology methods teachers have long offered rules of thumb that a survey ought to have an X% response rate, or else it is crap. Frankly, we need to shut up about this and reconsider what the current evidence says.
2. Many sociologists have been down on “objectivity” for a long time. The idea is that one cannot separate one’s values from one’s analysis, and even supposing that one might be able to do so is folly. Silver has said he votes mostly Democratic and that he supports Obama, yet also that these beliefs do not influence his forecasting. Critics said that his forecasts were obviously being influenced by (or being devised in the service of) his politics. Seems to me that, from a consistency standpoint, many sociologists should have been congenial to the notion that these critics were right.
Instead, the error in Silver’s forecast was largely in not being bullish enough about the Democrats’ prospects. What will be an interesting test of Silver’s commitment to scientific principles — and those of all of us who are lauding him right now — is what happens when Silver is running numbers for an election that is not going Democrats’ way. Will his analyses be overoptimistic to avoid alienating his current fans? If not, will his current fans bemoan how he appears to have turned traitor?