TOPIC: SACS Data
Deep Throat



UNIVERSITY OF SOUTHERN MISSISSIPPI


CAAP Analysis Highlights


Spring 2005

 


Purpose


The purpose of this study was to pilot the use of the Collegiate Assessment of Academic Proficiency (CAAP) exam to provide information about learning outcomes associated with the Core curriculum. A secondary purpose of CAAP testing is to identify any problems that should be addressed with community colleges within the statewide articulation agreement.


Methodology


The CAAP exam was selected by the General Education Committee of Academic Council and administered by Institutional Effectiveness. CAAP is a set of standardized tests developed to assess college-level skills. There are six components to the CAAP test: Critical Thinking, Mathematics, Reading, Science, Writing Essay, and Writing Skills (descriptions are available at the end of this report). Administration of the CAAP tests took 40 minutes for testing, with an additional 10-15 minutes for demographic data collection. Six upper-division courses in each of the colleges on the Hattiesburg campus and six upper-division courses on the Coast campus (covering all colleges) were selected as the sample of students.


Courses in the sampling frame were identified based on the number of transfer students (transferring more than 30 hours) and the number of native students (transferring 30 or fewer hours, including 0 transfer hours) in the course. Deans were also asked to provide a listing of professors who might be willing to allow their students to take the test during class time. These professors were contacted by the Director of Institutional Effectiveness to see if they would be willing to participate in the testing. Testing was conducted on the Coast campus during the week of March 21 and on the Hattiesburg campus during the week of March 28.
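
(A rough sketch of how a sampling frame like that might be assembled from enrollment records is below. This is not the office's actual procedure; the file name, column names, and the minimum-count cutoff are all invented, and only the 30-hour transfer rule comes from the description above.)

# Hypothetical sketch of building the course sampling frame described above.
# Assumes a roster export with one row per student per upper-division course;
# the file and column names are invented.
import pandas as pd

roster = pd.read_csv("upper_division_rosters.csv")
# expected columns: course_id, campus, college, student_id, transfer_hours

# Native = 30 or fewer transfer hours (including 0); transfer = more than 30.
roster["status"] = roster["transfer_hours"].apply(
    lambda h: "transfer" if h > 30 else "native")

# Count transfer and native students in each course.
frame = (roster.groupby(["campus", "college", "course_id"])["status"]
               .value_counts()
               .unstack(fill_value=0)
               .reset_index())

# Keep courses with a reasonable mix of both groups as testing candidates
# (the 5-student cutoff is arbitrary, purely for illustration).
candidates = frame[(frame["transfer"] >= 5) & (frame["native"] >= 5)]
print(candidates.sort_values(["campus", "college"]).head())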


 


Key Findings

Comparing Local and National CAAP Scores


•Scores are comparable in 3 of the 6 main components (Reading, Science and Writing Skills).


•USM students have significantly higher scores in Critical Thinking.


•USM students have significantly lower scores in the Math and Essay components.


Comparing Transfer to Native USM Students


•Scores are comparable in 5 of the 6 main components.


•Native USM students have significantly higher scores in the Essay component.


Comparing Hattiesburg to Gulf Coast


•Scores are comparable in 4 of the 5 main components.


(The Essay test demographic section does not allow for differentiation between campuses.)


•Gulf Coast students have significantly higher scores in Science.


Comparing Colleges


•There are significant differences in scores among the five colleges in most areas.


 


(The Essay test demographic section does not allow for differentiation between colleges.)


•The College of Science and Technology has significantly higher scores in Writing when compared to the College of Health and the College of Education and Psychology; in Critical Thinking when compared to the College of Business and the College of Education and Psychology; and in Science when compared to the College of Education and Psychology.


•The College of Arts and Letters has significantly higher scores in Reading when compared to the College of Health.


•The College of Business has significantly higher scores in Writing when compared to the College of Health and in Science when compared to the College of Education and Psychology.


•The College of Education and Psychology has significantly higher scores in Reading when compared to the College of Health.


 


Comparing ACT-CAAP Linkage

•USM students did better than the Reference Group in Lower than Expected Progress in all 4 comparable components.


•USM students did better than the Reference Group in Expected Progress in all 4 comparable components.


•USM students did better than the Reference Group in Higher than Expected Progress in 2 areas (Reading and Science).
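
(A note on the "significantly higher/lower" language above: the highlights do not say which test was used, but comparing a local mean against a published norm is typically done with a t-test. A minimal sketch with invented numbers is below; the report does not publish raw scores, so nothing here reproduces the actual analysis.)

# Illustration only: how a "significantly different" claim is usually checked.
# The scores and the national mean below are invented, not taken from the report.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
usm_math = rng.normal(loc=56.0, scale=5.0, size=120)  # hypothetical USM Math scores
national_mean = 56.9                                  # hypothetical national CAAP Math mean

# One-sample t-test of the local mean against the national figure.
t, p = stats.ttest_1samp(usm_math, popmean=national_mean)
print(f"t = {t:.2f}, p = {p:.4f}")  # p < .05 would support "significantly different"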



__________________
Reorganization needed


These data are very interesting and potentially important. Nonetheless, the various colleges are quite heterogeneous. Some of the departments are inappropriately housed. Are the data also available by department? College-level data might be of interest to SACS, but departmental-level data would be more helpful to the university.

__________________
Deep Throat


Reorganization needed wrote:


These data are very interesting and potentially important. Nonetheless, the various colleges are quite heterogeneous. Some of the departments are inappropriately housed. Are the data also available by department? College-level data might be of interest to SACS, but departmental-level data would be more helpful to the university.


The data is in graph and table format that won't copy and paste to the board.  I will try to supply what data I can.  However, you won't find "department-level" data because it wasn’t done that way. Only a few professors volunteered to have their class time used for this survey.  For example, a Math class would have students from all over the university taking the class and the survey. 



__________________
Data Hound


Deep Throat wrote:
you won't find "department-level" data because it wasn’t done that way. Only a few professors volunteered to have their class time used for this survey.  For example, a Math class would have students from all over the university taking the class and the survey. 

It's too bad the response sheets weren't coded by the students' majors and other dimensions. That way some very enlightening analyses could be conducted. Each department really needs to know how their own majors performed.

__________________
astonished


BE VERY CAREFUL MAKING ANY CONCLUSIONS OR INFERENCES FROM THE CAAP DATA!!!!!


Look carefully at the methodology. Each test was given to a different class in the various colleges. None of the classes were in the common core for each college. Thus you would, for example, get only economics majors in an upper-level class taking the quant portion of the exam, or upper-level accounting students taking the Critical Thinking portion. Also, look at the cell sizes. The only thing learned from this exam is about the process of administering it.


Releasing these results without the appropriate caveats is how we get a reputation for "junk" research.



__________________
Invictus



Data Hound wrote:

Deep Throat wrote:

you won't find "department-level" data because it wasn't done that way. Only a few professors volunteered to have their class time used for this survey. For example, a Math class would have students from all over the university taking the class and the survey.

It's too bad the response sheets weren't coded by the students' majors and other dimensions. That way some very enlightening analyses could be conducted. Each department really needs to know how their own majors performed.




Odds are they were. However, ACT uses its own major-code system & it's quite complicated to disaggregate department-level results. (It's possible to specify special break-out groups, but it's a cost-per-comparison gig.)

If you're that hot to do a dissertation, see if Dr. Exline's office has the data disk. It's a standard component of what ACT returns to the institution. But trust me, it will take a lot of man-hours & a fair amount of SPSS expertise to do it.
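
(For anyone who does get the data disk: a rough idea of what the roll-up step could look like, assuming the scores can be exported to CSV and a crosswalk from ACT major codes to USM departments is built by hand. None of this is ACT's actual file layout; every file and column name is made up.)

# Hypothetical sketch of rolling CAAP results up by department.
# Assumes a CSV export of the data disk plus a hand-built crosswalk;
# file and column names are invented.
import pandas as pd

scores = pd.read_csv("caap_scores.csv")              # student_id, act_major_code, component, score
crosswalk = pd.read_csv("major_code_crosswalk.csv")  # act_major_code, department

merged = scores.merge(crosswalk, on="act_major_code", how="left")

by_dept = (merged.groupby(["department", "component"])["score"]
                 .agg(["count", "mean", "std"])
                 .round(2))
print(by_dept)  # tiny counts here are exactly the cell-size problem raised above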

__________________
Outcome


Invictus wrote:


ACT uses its own major-code system & it's quite complicated to disaggregate department-level results. (It's possible to specify special break-out groups, but it's a cost-per-comparison gig.) . . . But trust me, it will take a lot of man-hours & a fair amount of SPSS expertise to do it.

I suppose that's why some of the smaller liberal arts colleges that are committed to providing a good education to their students require that graduating seniors take the "advanced" (subject matter) section of the GRE. That way the departments can determine to what extent their majors are learning the content of the discipline.

__________________
Invictus



Outcome wrote:

I suppose that's why some of the smaller liberal arts colleges that are committed to providing a good education to their students require that graduating seniors take the "advanced" (subject matter) section of the GRE. That way the departments can determine to what extent their majors are learning the content of the discipline.



Bear in mind that in those cases, the student is generally charged the testing fee. If I'm correct about the USM institutional effectiveness application, the university assumed the testing costs. I might also note that this was a "sample" & it's quite likely (probable) that the sample sizes within any given department would be so small that when the results reflected negatively on a department, everyone here would be screaming "foul!"

This is the Catch-22 with outcomes assessment: either we increase the costs to students or we increase the costs to the institution. If we use a sample-based design, then smaller departments are at the mercy of the stratified random design. (I'm not suggesting, BTW, that USM used a stratified random design. It may not have used a random design at all. Does anyone have clarification on the methods used to select the "testees"?)
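
(Purely for illustration, a stratified draw by college might look like the sketch below. There is no claim that USM did anything like this; the file, the column names, and the 50-per-college quota are invented.)

# Hypothetical sketch of a stratified random draw of test-takers by college.
import pandas as pd

students = pd.read_csv("enrolled_students.csv")  # student_id, college, campus

# Draw up to 50 students per college (fewer if a college has fewer enrolled).
sample = (students.groupby("college", group_keys=False)
                  .apply(lambda g: g.sample(n=min(50, len(g)), random_state=42)))

print(sample["college"].value_counts())  # small colleges end up with small cells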



__________________
Petunia


Invictus wrote:


This is the Catch-22 with outcomes assessment

A Catch-22 perhaps, but maybe not the type of Catch-22 you are thinking about. We tried that with nursing with disastrous results. Maybe USM doesn't want any record of what its students know.


__________________