TOPIC: latest HA articles
Otherside

RE: latest HA articles


quote:

Originally posted by: stephen judd

"Yes -- I think that is where the confusion has been among many of us. It now does seem as though we were beginning to address the issues of the past review while beginning to work for the upcoming one. I do know that it wasn't entirely clear to me as a faculty member that Iin submitting material for my unit it was ultimately destined for SACs -- "


Stephen,


I studied the material and found that my unit does have an assessment plan. I even recall seeing and approving the plan some time ago (1-2 years?), but I don't believe we have had time to use the plan to acquire any assessment data. Some parts of the plan were even revised for 2004-2005. If SACS is expecting three years of data, it seems to me they will have to wait until about 2007. I hope I'm wrong.


   



__________________
Punt


Kick

__________________
stinky cheese man


otherside--this is what i suspected. units haven't been able to gather data, weren't aware of the issue, and so forth. institutional effectiveness demands more than most here are accustomed to.

__________________
Otherside


quote:

Originally posted by: stinky cheese man

"otherside--this is what i suspected. haven't been able to gather data, not aware of the issue, and so forth. institutional effectiveness is more than most here are accustomed to."


I understand what you were saying, Stinky Cheese Man (what a name!). It appears to me that if you don't assess your program to see whether it has quality, you can always call it "world class" in good conscience--"plausible deniability". Until you discover otherwise, of course.



__________________
stinky cheese man


ironically, if you're referring to polymer science, they are probably better than the "average bear" at assessing program quality, even at the undergraduate level. as i said in this thread or another (i'm getting thread-worn), institutional effectiveness and assessment is a constant process: are our students learning what we want them to, and if so (or not), what do we do to improve or change? to me it's a basic item of what all academic departments ought to be doing at all levels (even doctoral).

__________________
Invictus


quote:
Originally posted by: stinky cheese man

"are our students learning what we want to, if so/not what do we do to improve or change? to me it's a basic item of what all academic departments ought to be doing at all levels (even doctoral)."


Isn't it really an "upward" extension of what the best teachers do at the classroom level anyway? When you give an exam or other assignment, yes, you assign grades to students, but you also look at what they missed (and how they missed it) to see what you can do better as a teacher next go-round. At least that's what I did when I taught.

At the department level, the measures get a little more global -- grade distributions in courses, withdrawal rates, students' grades in sequential courses as indicators of the effectiveness of the prerequisite, etc. -- but the whole concept is the same. What can be done better next time?

Advancing to the college level, the measures are even broader -- enrollment, retention, graduation rates, licensure rates, that sort of thing -- but the conceptual process is no different.

A lot of this "missing documentation" is information that department chairs & deans ought to be using to plan for staffing, allocate resources, evaluate faculty performance & set up schedules. How is this stuff done at USM? Is it simply a matter of politicking or do deans do some degree of "scientific management?"

Isn't it ironic that an institution with a scientist as president isn't managed very "scientifically?"

__________________
stinky cheese man


invictus--you and i are singing from the same page of sheet music (you pick it). at USM, unfortunately, lots of the planning and assessment are not connected to budget. i talked to someone who has been through some SACS workshops, and the connection to budget is crucial. there have to be monetary links to the planning and assessment process.

__________________
COST faculty


quote:

Originally posted by: stinky cheese man

"ironically, if you're referring to polymer science, they are probably better than the "average bear" at assessing program quality, even at the undergraduate level. as i said in this thread or another (i'm getting thread-worn) institutional effectiveness and assessment is a constant process. are our students learning what we want to, if so/not what do we do to improve or change? to me it's a basic item of what all academic departments ought to be doing at all levels (even doctoral)."

What makes you think polymer science is better than average at assessment? They have no accrediting body.

__________________
stinky cheese man


i say that based on the fact that they know more about their undergrads than most departments do. if i am wrong, please correct me.

__________________
COST faculty


quote:

Originally posted by: stinky cheese man

"i say that based on the fact that they know more about their undergrads than most departments do. if i am wrong, please correct me. "

I don't want to get this thread too far off topic. I do not know why you say it is a "fact" that they "know more about their undergrads than most departments do." Can you speak for most departments? It is true that PSC follows their majors closely, involving their undergrads in research very early on, and they have a very regimented curriculum for their undergrads. However, I am familiar with other departments in CoST that answer to accrediting bodies and track their undergrads very closely as well.

__________________
Invictus


quote:
Originally posted by: stinky cheese man

"invictus--you and i are singing from the same page of sheet music. (you pick it). at USM unfortunately, lots of the planning and assessment are not connected to budget. i talked to someone who has been through some SACS workshops and the connection to budget is crucial.. there have to be monetary links to the planning and assessment process."


"A plan with a budget is not a plan." One of the first "institutional effectiveness gurus" I ever met told me this back around 1989. There are different ways to approach it, of course, but the linkage has to be there. As I've said before, a lot of it is common sense stuff & USM probably does some of it without knowing it. Do deans review enrollments over time, historical adjunct staffing levels, etc., when they are asked to approve new full-time positions in departments under their supervision?

But there are plenty of "institutional effectiveness" decisions that aren't strictly tied to budget. When a department chair looks at a set of bad student evaluations or a horrible grade distro for an adjunct & decides not to employ that person in the future, that chair has closed the loop ... if the chair is working from a department objective that specifies a target for evaluation results or maximum withdrawal rate. (In this case, IE helps keep the department chair from being accused of making arbitrary & capricious decisions.)

I'm sticking to some very basic examples here, because that is really where 99% of the things that can make a real difference for students take place. But if we want to get "global" about it, had USM been working from a real IE plan all along, the nursing department would be in a lot better shape right now & the CISE wouldn't be fretting about NCATE, because there would've been "global" budgeting priorities set on those things.

If there is a big "sell" for IE to the top brass, it shouldn't be fear of SACS. It should be that a bona fide system of planning, assessment & use of results lessens the need for "crisis management." Right now, USM appears (to the outside observer) to jump from one "crisis" to another without a lot of vision & direction. Maybe the top administration gets off on crisis management, I dunno, but it's really not a healthy thing for individual human beings or for organizations.

The big stumbling block to all this is that a lot of administrators cherish having "discretionary money" & can be awfully reluctant to have to let that money out based on standards, justification & demonstrated needs. Human nature wants to be able to use "power" (in this case, money) to reward friends & penalize enemies. In a good IE system, decision making (including budget) has to be divorced from personalities & hooked back into research-driven justification of needs.

That's one reason Shelby Thames needs to go back to the lab. He's forgotten what research-driven decision making is all about.


__________________
Googler


In the HA article, Auburn says comparing its SACS problems with those at Southern Miss is like comparing "apples and oranges."


http://www.hattiesburgamerican.com/apps/pbcs.dll/article?AID=/20041212/NEWS01/412120307/1002


The fruitbasket turnover at Southern Miss helped create this latest mess, and the end result is an unwanted fruitcake.



__________________
Admin Toady


quote:

Originally posted by: Invictus

" "A plan with a budget is not a plan." One of the first "institutional effectiveness gurus" I ever met told me this back around 1989. There are different ways to approach it, of course, but the linkage has to be there. As I've said before, a lot of it is common sense stuff & USM probably does some of it without knowing it. Do deans review enrollments over time, historical adjunct staffing levels, etc., when they are asked to approve new full-time positions in departments under their supervision? But there are plenty of "institutional effectiveness" decisions that aren't strictly tied to budget. When a department chair looks at a set of bad student evaluations or a horrible grade distro for an adjunct & decides not to employ that person in the future, that chair has closed the loop ... if the chair is working from a department objective that specifies a target for evaluation results or maximum withdrawal rate. (In this case, IE helps keep the department chair from being accused of making arbitrary & capricious decisions.) I'm sticking to some very basic examples here, because that is really where 99% of the things that can make a real difference for students takes place. But if we want to get "global" about it, had USM been working from a real IE plan all along, the nursing department would be in a lot better shape right now & the CISE wouldn't be fretting about NCATE, because there would've been "global" budgeting priorities set on those things. If there is a big "sell" for IE to the top brass, it shouldn't be fear of SACS. It should be that a bona fide system of planning, assessment & use of results lessens the need for "crisis management." Right now, USM appears (to the outside observer) to jump from one "crisis" to another without a lot of vision & direction. Maybe the top administration gets off on crisis management, I dunno, but it's really not a healthy thing for individual human beings or for organizations. The big stumbling block to all this is that a lot of administrators cherish having "discretionary money" & can be awfully reluctant to have to let that money out based on standards, justification & demonstrated needs. Human nature wants to be able to use "power" (in this case, money) to reward friends & penalize enemies. In a good IE system, decision making (including budget) has to be divorced from personalities & hooked back into research-driven justification of needs. That's one reason Shelby Thames needs to go back to the lab. He's forgotten what research-driven decision making is all about. "


Agree 100%. I was reamed on another thread when I noted that the NCATE process has long been neglected here, which it has. And this applies to academic assessment in general. We do a mediocre job of assessment, from teaching at the classroom level all the way to assessments at the institutional level (I think Brad put in overtime sweat equity to try to get us up to speed on some of this). When I examined some basic budget priorities across colleges and departments, such as graduate assistantship monies, I found inequities that were not data driven. Why? Well, the answer is that it has always been done that way--why let oneself be confused by data?


Virtually all accreditation bodies have transitioned to outcomes- or performance-based assessments in the past decade (rather than input-oriented assessments), and we have been behind the curve and are playing catch-up across the board. What does this mean? It means that it is the department's, or unit's, or institution's responsibility to demonstrate to the accrediting body that a clear model exists, that we do what we say we are doing, and that student outcomes are based on data and not on opinion, reputation, or wishful thinking--and, most important, to use those data to improve what we do. Stephen noted on a previous thread the dangers and pitfalls of performance-based standards accreditation (especially when an institution is underfunded and the people are stretched to the limit). But this paradigm shift is here to stay. Time to take lemons and make some lemonade.


One of the biggest challenges is getting faculty and chairs on board. Faculty and chairs can get real ****ed when an e-mail arrives requesting data that is needed yesterday. Damn it, it looks just like a request I got last week! Unfortunately, until systems are in place at all levels to collect and manage data that produce reliable and valid information, such requests will go out all too often. In some ways, the SACS probation is great news for us. It may (repeat may) energize everyone (students through upper admin) to where we need to be with data-driven decision making at all levels.



__________________
stinky cheese man


since it's sunday morning--amen, brother or sister. you've got it! however, expect lots of resistance from units used to getting what they want because of impression or reputation, even when the data may not support it. sadly, performance-based assessment has been with SACS since '95.

__________________