Two items in my feed today with one theme: using the new College Scorecard to create college rankings that account for the quality of inputs, so to speak.*
Phil has had a couple of posts now about the practice of journal editors encouraging citations to a journal that they edit, and it sounds like there may be more. I should say that I don’t recall ever having an editor say something as… direct as the statement Phil quotes, but I do remember being on projects where, on our own initiative, we’ve inserted references to a journal or the work of its editor with a “can’t hurt our chances!” rationale.
One might think the specific practice of editors encouraging citations to their journal for impact-factor purposes could be curbed by simply eliminating journal-level self-citations from impact factor counts. But my suspicion is that when people insert citations with the idea of pleasing editors at a specific journal, they mostly don’t bother to remove those citations if the paper gets rejected from that journal anyway. In other words, when journals encourage authors to cite other articles in their journal, there’s a direct and readily observed effect on the impact factor in the form of journal self-citations, but there is also a hidden, downstream effect from papers that end up published elsewhere. Depending on the journal’s acceptance rate and how early in the process references are added, the indirect effect could be substantial relative to the direct one.
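To make the acceptance-rate point concrete, here is a toy back-of-the-envelope model. Every number in it is a made-up assumption for illustration, not an estimate: suppose an editor coaxes a couple of extra citations into each submitted paper, the journal accepts a small fraction of submissions, and some fraction of the inserted citations survive in rejected papers that are published elsewhere.

```python
# Toy model (all numbers are made-up assumptions, not estimates):
# accepted papers yield direct self-citations at the coercing journal,
# while rejected papers that keep the inserted citations yield
# "indirect" citations arriving from other journals.

def coerced_citations(n_submissions, extra_cites, acceptance_rate, kept_after_rejection):
    """Return (direct, indirect) citation counts to the coercing journal."""
    direct = n_submissions * acceptance_rate * extra_cites
    indirect = n_submissions * (1 - acceptance_rate) * kept_after_rejection * extra_cites
    return direct, indirect

# With a 12.5% acceptance rate and 75% of inserted citations surviving
# rejection, the indirect effect far exceeds the direct one:
direct, indirect = coerced_citations(100, 2, 0.125, 0.75)
print(direct, indirect)  # 25.0 131.25
```

The point of the sketch is only that a low acceptance rate flips the arithmetic: most coerced citations would come from papers the journal never published, which no self-citation exclusion would catch.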
On the bright (if somewhat perverse) side, the practice could actually be useful to anybody who wanted to use networks of citations across journals to make inferences about journal prestige. If a publication follows a chain of Journal A -> Journal B -> Journal C -> Journal D in order to get published, the version in Journal D will carry the traces of efforts to please Journals A, B, and C, including citations to those journals; had it been accepted by Journal A, it would carry no traces of attempts to please B, C, and D, because it was never sent there. Put another way, the order in which authors send out articles would be a good way of sussing out the hierarchy of journal prestige, but that order is private information; authors inserting gratuitous citations to those journals, and then leaving them in, is a way that private information can be made visible.
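The inference idea above can be sketched in a few lines. The data here are entirely hypothetical: assume we could somehow measure "excess" (above-baseline) citations between pairs of journals. If rejected papers cascade down the hierarchy carrying their coerced citations, excess citations flow upward, and ranking journals by net excess citations received would recover the pecking order.

```python
# Sketch of the prestige-inference idea with hypothetical data: papers
# cascade down the hierarchy, so published articles in lower-prestige
# journals carry excess citations to the journals that rejected them.

from collections import defaultdict

def prestige_ranking(excess_cites):
    """excess_cites maps (citing_journal, cited_journal) to a count of
    above-baseline citations. Returns journals ranked by net excess
    citations received (received minus given)."""
    net = defaultdict(int)
    for (src, dst), n in excess_cites.items():
        net[dst] += n  # receiving gratuitous citations suggests prestige
        net[src] -= n  # giving them suggests having been rejected above
    return sorted(net, key=net.get, reverse=True)

# Made-up counts consistent with papers cascading A -> B -> C -> D:
toy = {("B", "A"): 5, ("C", "A"): 4, ("C", "B"): 3,
       ("D", "A"): 3, ("D", "B"): 2, ("D", "C"): 2}
print(prestige_ranking(toy))  # ['A', 'B', 'C', 'D']
```

This is obviously a cartoon: real citation counts mix coerced citations with legitimate ones, and separating the two is the hard part.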
Academic “quit lit” is a large and probably growing genre. We’ve all seen it, agreed with some bits, disagreed with others. Today, I read a new essay in reaction to quit lit by Matthew Pratt Guterl that I found moving: What to Love. Here’s how it opens:
Let me tell you what to love.
Let me tell you why to stick it out.
Let me tell you why not to quit.
Like Tressie MC’s critique of quit lit, Guterl objects to the hyperfocus on work and the individual. Read the whole thing; it’s short, and hopefully you’ll find it as inspiring as I did.
Just in time for Hallowe’en, Phil Cohen has posted an account of a recent experience of trying to publish an article. The account is more striking when one pauses to consider that the story is getting told not because it is extreme in a discipline-wide sense, but because it happened to one of the few folks who write blog posts about things like this. In other words, too many people with too many papers are ending up with these sorts of stories.
I appreciated Phil’s forthrightness in the account, particularly the part where he reproduced one editor’s request to insert citations to more papers from their journal.
Beyond that, I was particularly fond of this paragraph of the summary:
Sociologists care way too much about framing. Most (or all) of the reviewers were sociologists, and most of what they suggested, complained about, or objected to was about the way the paper was “framed,” that is, how we establish the importance of the question and interpret the results. Of course framing is important – it’s why you’re asking your question, and why readers should care (see Mark Granovetter’s note on the rejected version of “the Strength of Weak Ties”). But it takes on elevated importance when we’re scrapping over limited slots in academic journals, so that to get published you have to successfully “frame” your paper as more important than some other poor slob’s.
Last month, ASR announced they would be publishing guidelines for reviewers of qualitative, theoretical, and comparative/historical papers. Today, draft versions of the historical guidelines were released (pdf version). Here’s the message that Monica Prasad posted on the CHS list today along with the guidelines:
“The committee to draft guidelines for comparative historical sociology articles in ASR has finished its work, and the draft guidelines are attached here. The committee consisted of Richard Lachmann (chair), Greta Krippner, George Steinmetz, Melissa Wilde, Nicholas Hoover Wilson, and Xiaohong Xu. Thank you to the committee for doing such an excellent job, and let’s all hope that the end result is more fabulous CHS articles in ASR!”
I’d love to know what you all think of them.