online student evaluations?

The topic here is online vs. paper evaluations of courses by students. I’m a department chair. The staff person who handles our paper evaluations (mandatory for instructors to administer) says we’d save a lot of labor with online evaluations, agrees that the problem with online evaluations is low response rates, and asks whether this could be solved by not releasing a student’s grade until they have done the evaluation. What is the experience on other campuses? What are response rates to online evaluations? Are there any systems in place to require that students evaluate? NOTE: Students can turn in a blank evaluation form or refuse to fill it out, and the staff member proposes that the student’s right not to respond could be coupled with mandatory evaluation by having the first question be “I prefer not to answer,” which would get credit for doing the evaluation. What are your campus’s policies?

Author: olderwoman

I'm a sociology professor but not only a sociology professor. I keep my name out of this blog because I don't want my name associated with it in a Google search. Although I never write anything in a public forum like a blog that I'd be ashamed to have associated with my name (and you shouldn't either), it is illegal for me to use my position as a public employee to advance my religious or political views, and the pseudonym helps to preserve the distinction between my public and private identities. The pseudonym also helps to protect the people I may write about in describing public or semi-public events I've been involved with. You can read about my academic work on my academic blog --Pam Oliver

12 thoughts on “online student evaluations?”

  1. A former colleague had a strategy I liked. He was able to see class-based response rates, so he built in some extra credit / participation grade bump / whatever for the entire class if the collective response rate exceeded some pre-determined threshold (e.g., 80%). He said it tended to do the trick.


  2. I’ve seen online evals at two universities and was at one when we switched from paper to electronic. Response rates with online are lower across the board, and the dispersions get wider because the people who actually do the evals are the ones who either really like or really hate the class. In short, I think online evals hurt data quality. The best method for getting good responses I’ve ever found is to give them in class at the START of the class rather than the end. That way, the longer the students spend writing comments, the less of the course material they actually have to listen to. Doesn’t necessarily make your quant scores higher, but you tend to get qual feedback that actually gives you a picture of what’s going on.

    I realize that online evals are seriously labor saving, but if anything important is in any way tied to student feedback (e.g., tenure decisions) I think the savings are a false economy.


  3. We just moved to electronic evals. I gave them in-class time to do them and reminded them for about a week after – I got a 33% response rate. Better than I expected, but still not great.

    Something else that wasn’t done (and could be) is linking eval responses to final grades – students get links to the evals through our course management system. This could be aggregated to preserve anonymity, but when I see a wide variance for a particular score, say “the class was interesting” or “the instructor seemed prepared,” it would be nice to know how much of that variance was explained by course grade.


  4. My institution implemented online surveys a couple of years ago, with horrible results. For whatever reason, the administration has refused to make them mandatory but instead delays the release of grades by a few days for those who don’t participate. Not surprisingly, response rates dropped from above 80% to below 20% for most course sections. The administration has transferred the burden to faculty, suggesting that professors reserve computer lab space and sacrifice class time to up those rates. It hasn’t helped that the administration has been relying on (i.e., paying a lot of money to) outside “consultants,” who are grossly misinforming them about power and sample size and use questions that are not consistent with the higher ed research on such evaluations.


  5. At my SLAC, we have the same policy as above (students have to wait longer to see their grades if they don’t complete them). Response rates dropped the first year, but now they are back around 80%.

    System automatically sends reminder emails. Also, students can fill them out on their phones (for better or worse).


  6. At my institution, departments (and individual faculty, even) get to decide whether to do it online or in paper, and it has been this way for years, so there is a considerable track record. We do not have a policy that requires students to fill one out to have their grades released or some such thing (the IRB would disapprove!). If you do just online evaluations, you get what others here have reported: low response rates and polarized ratings.

    The simplest solution is simply to take class time to have students fill them out–just like you do with paper evaluations. That gets the response rates equivalent to the old paper method (over 80-90%, depending on attendance), saves staff a ton of work, and doesn’t impose any extra burdens or B.S. policies on students. Here’s how it works:

    Every student gets an email with a link to the online evaluation from the department administrator at a pre-scheduled time. During class, you say you’re going to take 10-15 minutes for course evaluations, tell the students to check their email for the link, and give them 3 options: (1) do it on your laptop/tablet; (2) do it on your smartphone; (3) go to a computer in the computer lab down the hall. Because almost all of our students carry smartphones everywhere, they have no trouble filling out the evals online. If a student happens to be gone from class that day, they can still fill out the eval on their own time.

    In my opinion, this preserves the benefits of the paper evaluation but makes it much easier for all parties involved.


  7. My experiences are much like @kenkolb. The first couple years after the switch, I had a class with a 67% response rate, but since then (the last five years) they’re all above 80% and many well above that.

    ND has the grade waiting period for folks who don’t submit them, which helps. As far as what I do, I remind students in class, tell them how much I value their feedback, and will often find some sort of incentive. This year was the first one that I did what @jimi suggested – an extra point on the final exam if we got above a 90% response rate (we ended up with 94%). Most other semesters, I have my two classes compete and bring donuts or chocolate or something to the final exam of the class with the higher response rate. I’ve never allowed students to complete them in class, but I also don’t allow laptops. I’d consider such a policy in the future.

    @micah: We get students’ expected grades in our final breakdown, but not attached to the reviews themselves. I’ll bet that someone in institutional research looks at those correlations, but we don’t get to see them. Maybe your school would add them with a suggestion?


  8. It’s been interesting collecting responses here and on Facebook and learning about the diversity of approaches. My conclusions so far: (1) online evaluations with low response rates are worse than no evaluations; (2) anything that links grade release to doing the evaluation requires a campus-wide implementation.

    I think nearly everyone is likely to agree with me that the comments on evaluations are much more useful than the quantitative responses, at least to the questions used on our forms (which have not changed in the 35 years I’ve been at Wisconsin). One person noted that comments in online evaluations tend to be shorter than on paper. Another noted that receiving a compact listing of comments from an online system is more useful to an adjunct who needs those comments as part of a teaching portfolio than the hundreds of pages of hand-written comments scrawled on the paper forms, even if they have been photocopied or scanned into a PDF file.

    FWIW I have for many years conducted my own course-specific online evaluation of my course for which I give students direct course credit (as part of a participation grade). But that is different from the hopefully-neutral data collected for evaluating instructors.

    A very big downside of paper evaluations is keeping track of the paper, especially if you consider the written comments to be the most important information on the form.


  9. Our system went entirely online in 2009, and a university-wide committee did a pretty thorough report on the evaluation system in 2012. We do not have the timing of grade release linked to evaluations. However, the committee did make one suggestion to improve completion rates (which had been about 75% for paper and went down to 50% with online) that I believe is going to be implemented next year. Students will have access to the numerical scores for classes with 60%+ completion rates — but only if they themselves completed evaluations. This would be campus-wide, of course. I don’t know if it makes much difference.

    One other thing to keep in mind with a department going online on its own — five years of data at my institution showed online scores for instructor and course overall were 0.1 lower (on a five-point scale) than paper scores. So if you’re being compared with departments using paper, you might be at some (fairly slight) disadvantage by going online.


  10. Like Jessica says, the response rate rebounded at our school for the reasons she mentioned, and there are things that can be done by the instructors and by the system (e.g., email reminders) that help. But what I have noticed is a serious drop-off in the students’ qualitative responses. Students will fill out the numerical items, but very few take the time (with the online system) to write any comments at all. And the students who do comment do not seem representative of the class as a whole (e.g., the ones who love you or hate you are most likely to write something).


  11. Northwestern had been doing online evals since before I got there (in 2002) – run by the university, not the department. The system was set up in a way I think was brilliant: because students use course evals to shop for classes, they would get access to a condensed version of the online evals (a few of the quantitative and qualitative questions out of the total) IF AND ONLY IF they had filled out their own evals the semester prior. No need to wheedle or push; they did it themselves.

    Where I am now (NC State), it is all online, run by the university. The university’s incentive is that students stop receiving the daily nudge emails once they fill them out. We are not supposed to offer incentives or bribes, but most people I know do something – an extra credit question on the final exam if the class exceeds a certain response percentage, etc. – as well as use time in class.


  12. At my previous institution, they used to incentivize students to complete online course evaluations by randomly selecting those who completed all of their course evaluations and elevating their registration status (e.g., allowing a second year student to register earlier with the third year students).

    At my current institution, we decided to require faculty to allocate time in the last class for students to complete the course evaluations in class in order to ensure a reasonable response rate. I’ve also seen faculty offer a nominal number of extra credit points (e.g., 5 pts. towards a 1000 point course total) if students complete the evaluation or can show proof that they have done so (e.g., emailing a screencapture of the “completed” screen that appears after the evaluation is completed).

