In the COB, of the 6,883 students reported in classes for Spring 2005, 1,337 responded to the new online evaluation system. That works out to a response rate of about 19.4% (1,337 / 6,883 = 0.1942).
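For anyone who wants to verify that arithmetic, here is the whole calculation as a quick Python sketch; it uses only the two counts quoted above, and everything else is just illustration:

    enrolled = 6883    # students reported in CoB classes, Spring 2005
    responses = 1337   # online evaluation responses received
    rate = responses / enrolled
    print(f"{rate:.4f}")   # 0.1942, i.e. roughly a 19.4% response rate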
Can you explain where you got this information? And is that information available on the other colleges?
accosted wrote: Can you explain where you got this information? And is that information available on the other colleges?
Spring 2005 evaluations have been distributed in the CoB. Those data are given on the printout.
stephen judd wrote: Spring 2005 evaluations have been distributed in the CoB. Those data are given on the printout.
Hmmm - with a response rate of <20%, and given that the student evaluations are a major component of the teaching section of our annual evaluations, do you think some testing for non-response bias is in order?
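If anyone did want to run such a check, one simple approach is to compare responders and non-responders on something the registrar already records for every enrolled student. Below is a minimal Python sketch under that assumption; the file name and columns (students.csv, responded, gpa) are hypothetical placeholders, not the actual CoB data or reporting format.

    import pandas as pd
    from scipy import stats

    # One row per enrolled student; "responded" is 1/0, "gpa" is a known
    # characteristic available for everyone, responder or not.
    students = pd.read_csv("students.csv")
    responders = students[students["responded"] == 1]["gpa"]
    nonresponders = students[students["responded"] == 0]["gpa"]

    # Welch two-sample t-test: do responders differ from non-responders?
    t_stat, p_value = stats.ttest_ind(responders, nonresponders, equal_var=False)
    print(f"Response rate: {len(responders) / len(students):.1%}")
    print(f"GPA comparison: t = {t_stat:.2f}, p = {p_value:.4f}")

A small p-value there would suggest that the roughly 19% who responded are not representative of the classes being evaluated.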
Do you really believe they would test for non-response bias? We all know that when January arrives, these numbers will be the most believed numbers on campus for determining how faculty are evaluated. That is, if they are bad. If they are good, then what will they use to adjust a faculty member's evaluation downward? That's right - those criteria that no one knows about, otherwise known as the "kiss up" criteria.