To encourage student evaluations, I held a "demonstration" of how to submit them. I also announced it every class period toward the end of the semester, stressing how important it was.
- Out of approximately 60 students in multiple classes, I got (if I'm not mistaken) 13 responses.
Good thing we went to the on-line assessment without pilot-testing it.
The results from the spring online evaluations of teaching should be distributed this week.
Results from the spring pilot will go only to the faculty member.
Aggregate data will not be provided to chairs and deans, primarily because the response rate in the spring was low enough to call the validity of the data into question.
Response rate in the spring was initially calculated at 18.8%, though there may be some errors in that calculation that could push the rate to around 25% (in the past, the paper-based evaluations had a response rate of around 76%).
Results from the summer online evaluation will likewise go only to the faculty member.
It appears that paper-based evaluations may not be available for future use.
A committee has been formed to look into the future of student evaluations of teaching.
...
William W. (Bill) Powell, Ph.D. President, USM Faculty Senate
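A side note on the 18.8% vs. "around 25%" figures above: a response rate is just completed evaluations divided by eligible enrollments, so the errors mentioned would have to be in the denominator. A minimal sketch with made-up counts (the actual spring numbers aren't given above, so these are purely illustrative):

    # Illustrative only -- these counts are invented, not USM's actual numbers.
    responses = 188          # assumed number of completed evaluations
    denominator_used = 1000  # assumed denominator in the original calculation
    print(responses / denominator_used)   # 0.188 -> the reported 18.8%

    # If withdrawals, drops, or duplicate enrollments were mistakenly counted
    # as eligible, correcting the denominator raises the rate with no new
    # responses at all:
    denominator_fixed = 752  # assumed denominator after removing ineligibles
    print(responses / denominator_fixed)  # 0.25 -> the "around 25%" figure

Either way, both numbers are a long way from the roughly 76% the paper forms used to get.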
Participation low in online evaluations, by Reuben Mees: http://www.hattiesburgamerican.com/apps/pbcs.dll/article?AID=/20050714/NEWS01/507140308/1002
"Robert Burton, also a senior psychology major, said he went with a group to fill out the evaluations . . "If they gave bonus points or something, I think a lot of people would have done it," he said."
Bonus points for completing a faculty evaluation? What values do we impart to our students? Bonus points for attending class would also be nice.
Based on my experience in this area -- which is extensive -- the only classes in which one may reasonably expect a decent response rate for online student evaluations of faculty are online classes.
Online evaluations do work fairly well for faculty evaluations of supervisors & other types of employee evaluations. But the only "automation" that works for student evaluations is still Scantron.
I can't believe that USM even bothered to pilot test this.
The CoB faculty were told to make a copy and turn it in to their department. Now I am hearing they are being told to turn them back in due to errors in the data. Is any of this true?
Why would the faculty be required to submit a copy to their department when the results went to faculty only and were not intended for aggregate review? If the individual results are turned in to the department, the chair and ultimately the dean can create their own aggregate data report, valid or not.
Are there ways to get the response rate up? How about making the submission of an evaluation a requirement to access grades on SOAR?
Any other ideas out there? Have any other schools used these evaluations successfully? If our administration would consult faculty, they would work out these kinks long before any change in system was implemented.
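For what it's worth, the gate itself would be trivial to code; the hard part is the integration. A minimal sketch, with the caveat that none of these names are real SOAR calls -- they just stand in for whatever the records system actually exposes:

    # Hypothetical sketch of a grade-access gate. None of these names are
    # real SOAR APIs; they stand in for whatever the system exposes.

    def can_view_grades(enrolled_courses, evaluated_courses):
        """Allow grade access only once every enrolled course has a
        submitted (or explicitly declined) evaluation on record."""
        pending = set(enrolled_courses) - set(evaluated_courses)
        return not pending

    # A student enrolled in three courses who has dealt with only two of
    # them stays locked out until the third is submitted or declined:
    print(can_view_grades(["ENG 101", "MAT 210", "PSY 330"],
                          ["ENG 101", "MAT 210"]))  # False -- PSY 330 pending

An explicit "decline" option is probably essential, so a student who refuses to rate a course isn't held hostage: participation goes up without the responses being coerced.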
Ole Miss uses online evaluations for all courses. The evaluation submission is required in order for students to access grades online. I understand they have a fairly high response rate, around 85%.
Googler wrote: Ole Miss uses online evaluations for all courses. The evaluation submission is required in order for students to access grades online. I understand they have a fairly high response rate, around 85%.
That would work. It does require pretty tight integration with "web services." IIRC, Ole Miss is using a one-off administrative software system (can't recall the vendor at the moment) &, based on my own experience with their system, it's amazing they can get that kind of return rate, because they have immense problems generating a simple mail merge or sending email that actually arrives in the recipient's in-box.
Vict, last I heard they were using something called Bravo, or some similar name. Whenever people complained about PS, the response was -- at least it's not as bad as what UM uses.
The student's response about extra points is clear: most USM faculty bribe students with bonus points to get better evaluations, better participation, fewer problems in class, and fewer visits during office hours. I don't know why you all just don't admit that you hate USM's students and that you can't stand to be around normal people.
The trolls are out early this morning. Don't feed the trolls.
Based on my experience in this area -- which is extensive -- the only classes in which one may reasonably expect a decent response rate for online student evaluations of faculty are online classes. Online evaluations do work fairly well for faculty evaluations of supervisors & other types of employee evaluations. But the only "automation" that works for student evaluations is still Scantron. I can't believe that USM even bothered to pilot test this.
Well, I've been gone for a while, but from my conversations with former colleagues, USM never "pilot-tested" anything related to online evaluation. Instead, there was a mailing to all faculty and staff, sent near the end of the semester, announcing that online evaluations were to be instituted as of the end of the term.
Nobody ever knew why the "old" method was discontinued. When I was at USM, course evaluations were done with Scantron sheets and #2 pencils, administered during the last two weeks of class by neutral observers, and with statistically designed questions extracted from a Purdue University evaluation form.
Of course, that's also "institutional memory," which we've discovered has been erased during the SFT presidency. Since Shelby hasn't taught a class to undergrads in decades, it's not surprising that he had no knowledge of the course evaluation procedures in place when he assumed the self-proclaimed theocracy.