that faculty impact “study”

I got a call this morning from the Daily Tar Heel because, while UNC was dead last among the 94 universities covered in the study Kieran has been mocking for its invention of an MIT sociology department, I am apparently the third-most-impactful faculty member in that dubious list. Talk about damning with faint praise.

So I went to the website to read the study methods. The first thing I noticed was that the methods statement was a PDF file provided in .zip format. Hmmm. A really big, detailed file? No. 17 pages. All text. They saved 27K by zipping it. That way you have to download, unzip, and open it. Who thought that was a good idea?

On second thought, perhaps hiding the methods document from readers was prudent. Most of the methods discussion is actually a kind of general discussion of rankings, particularly the NRC rankings, and why they’re bad. Or something. Or why people have criticized them.

The actual methods description reveals that they used Google News Archive searches to count media mentions of scholars at each university. Because “the Google News Archive Advanced… does not always produce the same citation count when a search query is repeated” (p. 5), and indeed “in a number of cases, the variation between queries extended beyond a single standard deviation,” they averaged three searches for each scholar. One might be a bit puzzled at why a straightforward measure against a theoretically static database might return sometimes dramatically different results, but the document is curiously unconcerned with this threat to validity.

Faculty were arbitrarily limited to 50 citations, even though some had a very large number above that. The reason is… Google News wouldn’t display a longer list. That’s why many faculty members’ scores are 50 in the data.

So then faculty’s and departments’ scores were divided by their share of federal research funding among the full 94. The document helpfully cites Wikipedia (I kid you not) as a source for the fact that dividing by zero is undefined. Apparently this is the only statistical or mathematical principle on which they felt consulting an authority would be beneficial.

Since they wanted the combination of high citations and low federal funding to yield a high score, they simply substituted .01 for 0 in cases of no federal funding. Since the logic of the study is “bang for the buck,” and the website claims it is interested in “social science faculty who receive billions of public dollars for their research,” this decision seems odd (there are no bucks in these cases). The federal shutdown prevents me from checking, but I have a feeling there are very few social science faculty who receive billions of public dollars; in any case, those who receive none should probably have been excluded on the grounds that they are irrelevant to the project’s stated purpose!
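The distortion is easy to see with hypothetical numbers. Here is a minimal sketch of the scoring arithmetic as the methods document describes it; the mention counts and funding shares below are invented for illustration, not taken from the report.

```python
def impact_score(media_mentions, funding_share):
    """Divide media mentions by share of federal funding.

    Per the methods document, a zero funding share is replaced
    with 0.01 to avoid dividing by zero.
    """
    if funding_share == 0:
        funding_share = 0.01  # the report's substitution
    return media_mentions / funding_share

# A funded scholar: 50 mentions (the display cap), a 2% funding share.
funded = impact_score(50, 2.0)    # 25.0

# An unfunded scholar with the same 50 mentions.
unfunded = impact_score(50, 0)    # 5000.0 -- 200x the funded scholar

print(funded, unfunded)
```

The substitution means a faculty member with zero federal funding outscores an identically-cited colleague with even a tiny funding share by orders of magnitude, which is exactly the perverse incentive described below.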

There’s very little information on the website about the Center for a Public Anthropology, which issued the report. In fact, there’s no evidence that the Center includes anyone except Dr. Robert Borofsky, who charmingly if vaguely explains the project in a video address. But the Center clearly thinks that citations in the news “enrich the public conversation,” and should therefore be encouraged. Some of the mentions in a search of my name are research-related, focused mostly on political implications of my research, but many are about the UNC honor court, grade inflation, and other matters that are not based on my research at all.

Dividing by federal research funding gives a huge advantage to universities with low-status departments (or nonexistent ones), and the implication of the methodology is that the most “enriching” faculty member will be the one who has no funded research program and comments frequently to the press. That doesn’t strike me as the most appealing ambition.

Author: andrewperrin

University of North Carolina, Chapel Hill

6 thoughts on “that faculty impact “study””

  1. A quick look at the entry for my own institution also reveals that the data was neither given a cursory cleaning (two entries for one professor) nor checked against any recent list of faculty (unless Chris Weiss is still on the sociology faculty at CU and is just hiding in his office).


  2. Did you notice that—because they piggy-backed on an earlier thing they did—if the university doesn’t have a Ph.D program in Anthropology then it isn’t included in the sample? This has the effect of leaving out, e.g., every social science department at Georgetown.


  3. Another limitation of Google News Archive is that it does not include most television and radio content, though it’s an open question whether my recent appearance on Dancing with the Stars should count as “public impact.”

