There is a story circulating that, starting this semester, all student evaluations of faculty teaching are to be done online. This seems to be a situation fraught with peril (at USM at least; it might work fine at functioning institutions), a process that could fall victim to anything from incompetence all the way to malice.
quote: Originally posted by: road crew "There is a story circulating that, starting this semester, all student evaluations of faculty teaching are to be done online. ..."
Road Crew,
This is not a rumor. The memo has been issued. I'm reading the e-mail now. A committee studied this and made recommendations, but the administration seems to be implementing it without reading the report that points out all of the dangers and conditions that must be met to make it work properly. Faculty senators are discussing it now.
The Clemson administration pushed hard for on-line student evaluations of teaching for a couple of years before the Faculty Senate accepted it.
We've had them now for a year and a half. So far the return rate has been well below the return rate for paper evaluation forms, and some instructors' materials have been lost. I have not heard any reports of administrative tampering, however.
We still have paper forms at Clemson, but DCIT (our counterpart to iTech) has obviously been told to give them lowest priority. It takes about 4 months to get the results back.
In time, the electronic evaluations will work out OK (assuming they're not tampered with). The bigger lesson, though, is that only fools evaluate teaching on the basis of students' evaluations alone.
Who will be handling the e-evals? Will they be developed and administered in-house?
The entire South Dakota University System tried using the online ETS instrument, eSIRII, and was severely disappointed with it. Not only were the return rates poor, but the reports from ETS were garbage; e.g., several of the descriptive statistics were not related to the raw data by any mathematical or statistical method I could replicate. This was AFTER I pointed out that ETS was computing item means by adding omitted responses as zeros. The fallout is that ETS is abandoning the eSIRII altogether at the end of the summer, but they are letting us use it for free until then.
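To make that omitted-responses-as-zeros point concrete, here is a quick sketch (Python, with made-up ratings on a 1-to-5 scale; an illustration only, not ETS's actual calculation) of how far that convention drags an item mean down:

responses = [4, 5, None, 3, None]   # two students skipped this item

# Treating the omitted responses as zeros, as described above:
as_zeros = [r if r is not None else 0 for r in responses]
mean_with_zeros = sum(as_zeros) / len(as_zeros)        # (4 + 5 + 0 + 3 + 0) / 5 = 2.4

# Averaging over the students who actually answered the item:
answered = [r for r in responses if r is not None]
mean_of_respondents = sum(answered) / len(answered)    # (4 + 5 + 3) / 3 = 4.0

print(mean_with_zeros, mean_of_respondents)            # 2.4 vs. 4.0

On a 1-to-5 instrument a zero is not even a legal response, so every omitted answer silently drags the reported mean toward the floor.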
quote: Originally posted by: Robert Campbell "... The bigger lesson, though, is that only fools evaluate teaching on the basis of students' evaluations alone. Robert Campbell"
Robert, didn't you know that we operate under the principle, "The customer is always right"? That's the only evaluation that matters at USM. So it causes a grade inflation problem, so what? Keep the students happy is USM's motto.
It is my understanding that there is a financial incentive attached to these: those students who fill out the on-line forms will be eligible for one of several $250 "prizes"--winners chosen randomly, etc. Further evidence of the degraded condition of academics here.
The SGA has been pushing for online evaluations for years. They want this so that the results can be posted online, therefore letting students know who the "best" professors are.
quote: Originally posted by: asdf "The SGA has been pushing for online evaluations for years. They want this so that the results can be posted online, therefore letting students know who the "best" professors are."
The joint committee of the Graduate Council and Faculty Senate formulated the questions that are in the current online version. Their report also cited a number of concerns about implementation, security, and provisions to assure a broad response. As of now, I do not know whether those concerns have been discussed or will be reflected in the delivery system.
One change: the voluntary student comments have always been given only to the faculty member and were never intended to be part of the evaluation process, but rather to serve as a mechanism for course improvement. Those comments are now part of the online system and, as far as I can tell, do not seem separable from the rest of the evaluation.
This discussion is happening within the senate. At least one Dean has been alerted to the concerns and has forwarded them to Grimes and Exline. I have asked that, pending resolution of these concerns, we delay the online evals until the fall.
We'll see. I think this case is just the right hand not knowing what the left hand is doing, and once again this incredible rush to implement things. Incidentally, this has apparently been under discussion since 2002.
quote: Originally posted by: stephen judd "The joint committee of the Graduate Council and Faculty Senate formulated the questions that are in the current online version. ..."
A PS -- Amy Miller was good enough to submit a copy of a report from David Swanson at UM about their online eval and its problems. It is an excellent report.
quote: Originally posted by: Robert Campbell "Don't USM students use "Rate My Professor"? There are Rate My Professor sites for lots of universities. RC "
That was started a few years ago. Hopefully, Robert, you would be impressed that the students actually know something about biased sampling and that the opinions on a web site are not necessarily representative of true student opinions (mainly the people who really hate or love a professor bother to do this, so you lose all of the middle people). The SGA wants the actual faculty evaluations, which should be more representative of the mean. Now, reading some of the posts above, if the faculty evaluation becomes completely voluntary, with students filling it out at their leisure online, then the evaluations become nothing more than a "rate my professor" (a rough sketch below puts numbers on this).
What I will still never understand is why student evaluations seem to be the only measure of teaching effectiveness used at USM. Does any department on campus use peer evaluations as well?
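To put rough numbers on that "lose all of the middle people" problem, here is a back-of-the-envelope sketch (Python; every count and response rate below is invented, not USM or SGA data) of what a purely voluntary online form does to the respondent pool:

# Hypothetical class of 100 students rating a course on a 1-to-5 scale.
class_opinions = {1: 5, 2: 10, 3: 40, 4: 30, 5: 15}    # {rating: number of students}

# Assumed response rates for a voluntary online form: the students who
# really hate (1) or really love (5) the course respond heavily, the
# middle barely bothers.
response_rate = {1: 0.6, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.6}

def mean(counts):
    return sum(r * n for r, n in counts.items()) / sum(counts.values())

def extreme_share(counts):
    return (counts[1] + counts[5]) / sum(counts.values())

# Expected respondent pool under those assumed response rates.
respondents = {r: n * response_rate[r] for r, n in class_opinions.items()}

print(f"whole class:  mean {mean(class_opinions):.2f}, "
      f"extreme (1 or 5) ratings {extreme_share(class_opinions):.0%}")
print(f"respondents:  mean {mean(respondents):.2f}, "
      f"extreme (1 or 5) ratings {extreme_share(respondents):.0%}, "
      f"return rate {sum(respondents.values()):.0f}%")

With those made-up numbers the extreme ratings go from a fifth of the class to a solid majority of the respondents, with only about a fifth of the class responding at all: exactly the "rate my professor" effect.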
The problem isn't with using a computer to complete evaluations. It is in getting a proper sample (ideally 100%) of the students to respond.
Any instrument that relies on voluntary responses outside the classroom is absolutely invalid. If students at the beginning of class were led to a nearby computer lab to complete the survey, or completion of the survey was made mandatory (no grade without a survey perhaps), that might overcome the selection and response bias.
I certainly can see the value in saving all the manpower and paper involved with the current system. Additionally, departments could still distribute qualitative forms (which I've always found much more valuable).
quote: Originally posted by: asdf "... What I will still never understand is why student evaluations seem to be the only measure of teaching effectiveness used at USM. Does any department on campus use peer evaluations as well?"