spending endowments in a recession

A friend (at Duke, no less) and I were talking about the fact that, during the recession, both our institutions have used their considerable endowments pretty conservatively; my undergraduate college, Swarthmore, has been similar. Essentially they treat the interest coming off the endowment as current income and preserve the principal.

The issue, though, is that this makes endowment spending pro-cyclical: it rises and falls with income from other sources, such as state funding (at a public university), tuition increases, donations, and even grant money. I imagine that a less risk-averse university could actually claim impressive returns to its endowment by spending counter-cyclically. Some of those returns could be intellectual or mission-based, such as the ability to hire faculty for relatively little money in a weak job market. But I imagine the returns could be financial as well, in the form of increased alumni donations, potential revenues from discoveries, grant income, and so on. What am I missing? Why does it seem like nobody is seeking to spend endowment money aggressively this way?

hmm.

Top 10 sociology programs in terms of quality of graduate students, using the primary measure in the NRC (average quantitative GRE score):

1. UNIVERSITY OF IOWA
2. STANFORD UNIVERSITY
3. YALE UNIVERSITY
4. PRINCETON UNIVERSITY
5. UNIVERSITY OF CALIFORNIA-BERKELEY
6. UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL
7. UNIVERSITY OF MICHIGAN-ANN ARBOR
8. HARVARD UNIVERSITY
9. COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK
10. NEW YORK UNIVERSITY

The other measure of graduate student quality is percentage of first year students with external fellowships. That top 10:

1. TEMPLE UNIVERSITY
2. UNIVERSITY OF ARIZONA
3. WAYNE STATE UNIVERSITY
4. UNIVERSITY OF NORTH TEXAS
5. HARVARD UNIVERSITY
6. UNIVERSITY OF CALIFORNIA-SAN DIEGO
7. UNIVERSITY OF PENNSYLVANIA
8. PRINCETON UNIVERSITY
9. UNIVERSITY OF CALIFORNIA-LOS ANGELES
10. OKLAHOMA STATE UNIVERSITY MAIN CAMPUS

the scatterplot official* ranking of sociology graduate programs

After seeing the NRC ranking of graduate programs, my first impulse was to simply ignore it. The process has been a mess, and now the results reflect that mess perfectly. My second impulse was to write a post encouraging everyone else to ignore it. Obviously, that’s not going to happen! It’s so hard to stop myself from responding and pointing out the many flaws and the really bizarre results. But it won’t be long until everyone else takes up that task, so instead I think what we really need is an alternative.

The NRC study cost millions of dollars and tens of thousands of person-hours to create. I want to be cheap and fast, and better than the NRC ranking. Not way better, not perfect, just better. So, I give you the Scatterplot Official Ranking of Sociology (SORS), built in less than 15 minutes using a very carefully constructed proprietary algorithm (see footnote 2 for details). You will no doubt find surprises and program positions you’d quibble with, but anyone who compares my results to the “S” rankings of the NRC will immediately declare mine to be superior and more useful. I submit, therefore, that you will be more than justified in using the SORS in any future program evaluation, discussion about the state of sociology, or program initiative.

nrc rankings

The NRC rankings appear to have driven me out of blogging retirement. Here are some things I understand about the sociology rankings after reading coverage in the Chronicle and the report’s Appendix. Corrections welcome.

1. Books are not counted in the publications per faculty member figure. At all.
2. Citations to books are not counted in the citations per faculty member figure. At all.
3. Multi-authored publications are counted once for each author in the publication count, and their citations are counted once for each author in the citation count (so a paper with three co-authors in the same department counts as three publications, and each of its citations counts three times).
4. The average GRE score figure is based on the quantitative GRE only.

Also, if you are wondering about the #1 sociology programs in some key areas:
Most publications per faculty member: University of California-San Francisco
Most citations per faculty member: University of New Hampshire
Average time to Ph.D. for students: Bowling Green State University (3.25 years!)
PhDs with academic jobs: University of Miami
Average GRE score: University of Iowa
Percentage of students completing in 6 years or less: Baylor University
Percentage of new students with external grants: Temple University

two major lessons from today

1. Reputational rankings: maybe not so bad after all.

2. Uncertainty: if one is going to provide multiple sets of rankings and confidence intervals as a way of gesturing toward the uncertainty in the evaluation process, one might also consider, for example, trying to model the uncertainty in how to count books relative to articles, rather than simply counting a book as one article and leaving it at that (a rough sketch of what I mean follows).
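Here is a minimal sketch, in Python, of the kind of sensitivity check I have in mind. The program names, publication counts, and faculty sizes below are invented for illustration, not NRC data; the point is just that the ordering can shift as the book weight moves, which is exactly the uncertainty that the "one book = one article" choice hides.

```python
# A toy sensitivity analysis: how does a publications-per-faculty ranking
# change as we vary the weight of a book relative to an article?
# All numbers below are hypothetical, not NRC data.

article_counts = {"Program A": 40, "Program B": 25, "Program C": 30}
book_counts    = {"Program A": 2,  "Program B": 12, "Program C": 6}
faculty_size   = {"Program A": 20, "Program B": 18, "Program C": 22}

def ranking(book_weight):
    """Order programs by (articles + book_weight * books) per faculty member."""
    scores = {
        prog: (article_counts[prog] + book_weight * book_counts[prog]) / faculty_size[prog]
        for prog in article_counts
    }
    return sorted(scores, key=scores.get, reverse=True)

# The "one book = one article" choice is book_weight = 1; try a few alternatives.
for weight in (1, 2, 3, 5):
    print(f"book weight = {weight}: {ranking(weight)}")
```

One could imagine folding the spread of orderings produced by different plausible weights into the reported uncertainty, rather than reporting confidence intervals around a single weighting.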

More could be said. Well, I suppose I should also say that, in case anyone looks at the NRC spreadsheet in detail and wonders, Northwestern University does indeed provide “Instruction in Statistics.” And did so in 2006! So I’m not sure how we came to be tallied as not offering it.

ask a scatterbrain: when you just disagree

This, from a fellow junior faculty member at another university:

I’m serving on a committee that has just been asked by upper administration to strategize how we can pursue a new aim they’ve settled on for the university. Here’s the thing: I don’t agree with the new aim. So, developing a strategy for it isn’t high on my priority list. What can/should a junior faculty member do when serving on a committee that’s asked by the administration to do something they don’t agree with?