The legal authority and operating control of the institution are clearly defined for the following areas within the institution's governance structure:
3.2.2.1 the institution's mission;
3.2.2.2 the fiscal stability of the institution;
3.2.2.3 institutional policy, including policies concerning related and affiliated corporate
entities and all auxiliary services;
3.2.2.4 related foundations (athletic, research, etc.) and other corporate entities whose
primary purpose is to support the institution and/or its programs.
X Compliance
The off-site committee noted the following:
Section 102.03 of IHL Policies and Bylaws requires each institution to develop a distinct mission to be performed within the context of the Board and System missions. Section 102.04 requires a concise institutional statement of its core mission to be submitted to the Board for approval. A revised mission statement for the institution was approved by the Board in June 2005.
The Mississippi Constitution, Section 213 A, defines the duties and authority of the Board of Trustees, including fiscal oversight. Section 37-101-15 gives authority to the Board to exercise control over the use, distribution, and disbursement of all funds, appropriations, and taxes for the operation and capital expenditures of state higher educational institutions, including Southern Miss. IHL policies in sections 201 and 711 provide directions as to institutional financial reports, auditing procedures and requirements, and a uniform system of recording and accounting approved by the State Auditor. Section 301.0801 of IHL Policies and Bylaws, Duties of the Institutional Executive Officers, identifies fiscal management as a responsibility of the President. IHL policies 708.01 and 901.4 authorize student charges for auxiliary services and facility reserve funds and detail the processes for maintenance of those facilities.
Four not-for-profit foundations support the academic, research, and athletic activities of the institution. IHL Policy 301.0806 defines the purpose of private foundations, the importance of public confidence, provisions for a public, written agreement between the institution and the foundation, and the Board’s authority over the agents of the organizations. Operating agreements between the institution and the Southern Miss Foundation, the Southern Miss Research Foundation, the Southern Miss Foundation for Southern Miss Athletics, and the Southern Miss Alumni Association have not been approved by the Board of Trustees. (The agreements were to be submitted to the Board for approval at the September meeting. If approved, the institution will be in compliance with this standard.) Each foundation has its own mission statement, bylaws, and governing board. However, the directors/executive directors of the four foundations are employees of the institution and under the administrative authority of the University President.
On-site Review
At the time of the Off-Site Review, the institution had written operating agreements between itself and each of its four affiliated foundations, but the agreements had not been adopted by the Board of Trustees. In its Focused Report, the institution provided minutes of the Board of Trustees meeting of October 20, 2005 that show approval of the agreements for the Southern Miss Foundation, Inc., and the Southern Miss Athletic Foundation, Inc. The institution provided documentation that verifies the agreements between Southern Miss and its Alumni Association, and between Southern Miss and its Research Foundation, were approved by the Board of Trustees at their February 15, 2006 meeting.
3.3.1 The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
X Non-Compliance
The off-site committee noted the following:
The institution has recently embraced outcomes assessment, led by a newly-created Institutional Effectiveness department. The process includes annual student learning outcomes assessment for each program and outcomes assessments for each administrative support unit, as evidenced by the Academic Assessment reports and Administrative Assessment reports. Because this effort has only just been initiated, there is no evidence of improvement based on the analysis of the results, but there are plans in place for the use of the results. Academic program review has also been initiated as the result of a directive from the IHL. Accreditation reviews are used for programs where appropriate, and internal reviews are conducted of all academic programs not subject to accreditation. There is no evidence that a similar review process is in place for administrative and support units. However, it appears that the annual assessment report is used for this purpose.
On-site Review
The Focused Report indicates administrative and educational support units completed their own strategic plans to support the University’s strategic plan (p.36). Goals from these strategic planning documents appear in the 2005-06 annual assessment reports and link institutional goals to unit assessments, as presented in the Focused Report - Appendix A and on the Southern Miss IE web site (http://www.usm.edu/ie/). The 2005-06 Assessment Plans (http://www.usm.edu/ie/admin_assess_plans_05-06.htm) link unit outcomes to the 2005-08 Southern Miss Strategic Plan. This convergence clarifies the relationship between unit performance goals and institutional goals outlined in the Strategic Plan. On-site discussions with the Southern Miss institutional effectiveness key personnel provided additional insights into the integration of the strategic planning, program review, and outcomes assessment process for administrative and support units. Beginning in 2005, goals for administrative and support units were linked to division and strategic plan goals. All administrative and support
program efforts were included in this process. In both effect and practice, administrative and support programs are reviewed every three years in conjunction with the strategic planning process.
As with the academic units, the 2005 strategic plan process for administrative and support units is too new to have results for use as evidence of program improvement. Almost all of the administrative and support units have links to their assessment reports for 2004-05 on the Southern Miss IE web site
(http://www.usm.edu/ie/admin_assess_rpts_04-05.htm). Based on a review of a sample
of units' reports from each VP area, the quality of the 2004-05 Assessment Reports varies considerably across units. A recurring weakness in the assessment reports is a lack of documented use of results for program improvement. The unit plans typically established objectives and measures and documented results. However, even though they completed the column labeled "use of results" on the assessment report, they do not indicate how the results were used for program improvement (examples include Athletics, the Learning Enhancement Center, AA/EEO, and the Center for Research Support). Conversely, several units presented very good plans that included measurable objectives, criteria for success, results of their measurements, and actions that resulted from those assessment results (examples include Institutional Research, Libraries, Post Office, Dining Service, and the Registrar). As part of the institution’s ongoing program review and assessment efforts, units with insufficient outcome measures will need additional institutional support, training, and monitoring.
The institution has made significant progress through the development of its 2005 strategic plan and related assessment plans. The acquisition of a dedicated software package to support assessment data collection, monitoring, and reporting will strengthen the process and the use of assessment results for continuous improvement. Notwithstanding all of the planning and assessment efforts, the standard calls for the
institution to provide evidence of improvement based on assessment results. These results will not be available until later this year. Information presented in the assessment reports as preliminary February 2006 results did not provide sufficient evidence of improvement based on assessment results.
Recommendation 7: The Committee recommends the institution provide evidence of program improvements for academic units and administrative and support units.
3.5.1 The institution identifies college-level competencies within the general education core
and provides evidence that graduates have attained those competencies.
X Non-Compliance
The off-site committee noted the following:
The institution revised its General Education Curriculum (GEC) and established a new set of outcomes for students, beginning in 2003. The GEC identified five broad principles and 14 core student learning outcomes. However, the core outcome statements are vague and very broad and do not appear to identify attainable and measurable outcomes appropriate for college students. Thus, the off-site committee could not determine that
the institution has identified college-level competencies within its general education core. Furthermore, the institution presents no evidence of the alignment between these broad and somewhat vague outcome statements and the test it has chosen to administer, the Collegiate Assessment of Academic Proficiency (CAAP). The University discusses standardized testing of math, science, critical thinking, social science, reading and writing through the CAAP and its relationship to ACT, but does not indicate how these areas
specifically relate to each of the student learning outcomes. Furthermore, while the comparison of the scores of Southern Miss graduates to those of students at other institutions demonstrates a general achievement of college-level learning, the lack of any demonstrated alignment renders such results almost meaningless in terms of indicating the achievement of the institution’s stated general education outcomes. Further, the institution does not provide student learning outcome objective statements with targeted levels of accomplishment. It should be noted that the use of grades is not an effective method for measuring student learning outcomes.
On-site Review
In the Focused Report, Southern Miss subdivides the core’s 14 objectives into specific and measurable objectives aligned with core courses (Appendix L.5). These subdivisions address the first concern of the off-site committee about the vague and broad nature of the competencies.
Southern Miss proposes making use of the CAAP to measure attainment of college-level competencies in the general education core. Southern Miss administered the CAAP in Spring 2005 to approximately 700 students (Appendix L.1). CAAP assessments are matched with “old” GE core items for students who participated in the spring 2005 testing (Appendix L.2). CAAP will be administered in spring 2006 to students who started in
2003 under the new GE core. Appendix L.3 matches CAAP assessments with new GE areas. CAAP assessments do not provide data for all areas of the general education objectives, limiting their usefulness in documenting attainment of general education competencies. The information in Appendices L.1 and L.3 relates to the second concern of the off-site committee.
The Focused Report goes on to present a matrix (Appendix L.6) that matches 2004-05 general learning outcomes with courses, assessments, results, and use of results. The matrix does not match the subdivided outcomes with courses, making it difficult to determine the specific outcomes being assessed. Further, many of the learning outcomes in Appendix L.6 do not show the use of the assessment results for program improvement (e.g., SLO#1/CHE106, SLO#2/GHY104 & 105, SLO#3/GYL103). In addition, units did not report any use of assessment results when measures exceeded an objective’s target level (e.g., SLO#1/GHY104 & 105, PHY210, PSC190, DAN130, CHE107).
The Focused Report provides an extensive matrix of learning outcomes, responsible courses, and evaluation methods that will be assessed during 2005-06 in Appendix L.12. As with the 2004-05 assessments, the matrix uses the general outcomes and not the more specific subdivided outcomes in Appendix L.5. Many of the assessment measures lack criteria for success and continue to depend on simple rates/frequencies of activities without true measures of student success or failure. Many of the measures still depend on grades (actual letter grade or percentage score) for class assignments. These measures typically do not provide evidence of specific student skills/learning outcomes. Many of the assessment measures simply state that students will prepare "satisfactory" class assignments with no definition of what "satisfactory" means or how it relates to measurement of the learning outcome.
The information presented in Appendices L.1 – L.6 answers the off-site committee’s first, second, and third concerns about the relationship between GE competencies, the CAAP, and the GE courses. The information in the Focused Report does not adequately address the fourth concern about a lack of targeted levels of accomplishment of GE outcomes. Scoring rubrics used across courses in some disciplines (e.g., Appendix L.12, Outcome 1.a/COH100, HIS102) provide specific, measurable demonstrations of actual student attainment of skills or knowledge. Measurable results are not provided for many outcomes (e.g., 1.a/ANT101, PHI151, BSC110L). The off-site review noted appropriately that the use of grades as evidence of attainment of specific educational outcomes is not sufficient. Notwithstanding all of the planning and assessment efforts, the University has not met the standard to provide measurable college-level general education competencies and evidence of attainment of those competencies across the University.
Recommendation 9: The Committee recommends the institution establish learning outcome measures that demonstrate attainment of all general educational outcomes and present evidence that students have achieved these learning outcomes.
3.7.5 The institution publishes policies on the responsibility and authority of faculty in academic
and governance matters.
X Compliance
Off-Site Committee Comments:
In its Faculty Handbook, the institution publishes clear guidelines for the participation of the faculty in academic and governance matters. There are several major standing committees and councils whose membership and roles in academic and governance matters are described in the Faculty Handbook and which are appropriate for shared governance.
On-Site Committee Comments:
During the On-Site Committee visit, questions did arise from some faculty members
regarding shared governance, although non-compliance with this standard was neither
alleged nor demonstrated. While policies in the faculty handbook concerning the matter
might be viewed as having broad latitude for interpretation, the committee found no
evidence of actions inconsistent with the policy. The committee understands that a
revised policy that provides more clarity and specificity vis-à-vis the intent of the
institutional mission statement pertinent to the matter is on the verge of implementation.
…
3.10.5 The institution maintains financial control over externally funded or sponsored research and programs.
X Non-Compliance
The off-site committee noted the following:
The institution provided a link to the Office of Research and Economic Development and the Office of Contracts and Grant Accounting websites. The various policies and procedures presented, including their Policy for Compliance with Cost Accounting Standards, confirm that the institution understands the need for appropriate control over sponsored research expenditures. Sponsored program activities at the University are audited annually by state auditors, and the FY2004 audit identified two immaterial internal control deficiencies directly
related to sponsored programs: late financial reporting of two grants and inadequate control of equipment. The on-site review committee is encouraged to review the FY2005 audit report to determine if the correction plans implemented satisfied the concerns of the Office of State Auditor.
On-site Review
The FY2005 audit report, which is needed to determine if the correction plans satisfied the concerns of the Office of State Auditor, has not been provided. The on-site review committee hopes that the institution will encourage the Office of State Auditor to release the audited financial statements and management letter as soon as practicable in order that a determination of compliance can be made. A draft of the FY05 management letter was submitted by the institution and shows three deficiencies in internal control that were deemed to be reportable conditions: one instance of excessive indirect costs charged to a federal program and two instances of inadequate control of equipment. This particular finding was a repeat of a similar finding in FY04, albeit for a different grant. While the committee would not usually act formally on a draft letter, this letter does contribute to the committee’s concern regarding this issue. The committee believes that the institution should develop and implement an institution-wide equipment and inventory management system that complies with federal
and state requirements, and that it should develop and implement a policy that correctly calculates its indirect cost recovery on grants.
Recommendation 11: The committee recommends that the institution demonstrate an adequate response to management letter comments regarding reportable conditions and demonstrate implementation of effective corrective actions.
…
3.10.7 The institution operates and maintains physical facilities, both on and off campus, that are adequate to serve the needs of the institution's educational programs, support services, and other mission-related activities.
X Non-Compliance
The off-site committee noted the following:
The institution provided the Gulf Coast Campus Master Plan, a comprehensive document completed in 2004, which identifies the facilities needed to serve that campus through 2009, taking into consideration the existing facilities, projected enrollment growth, anticipated growth in academic programs, and required increases in faculty and staff. This report identifies specific areas that are deficient in quantity and quality, and recommends targeted property acquisitions as well as new construction to meet the defined needs. However, the impact of Hurricane Katrina on the Gulf Coast campus was not addressed in the materials provided. Further, the information provided for the Hattiesburg campus was insufficient to determine compliance. The Physical Plant Department is responsible for maintaining the facilities in a condition appropriate to their use. The department consists of the appropriate units and has established adequate policies and procedures to accomplish its mission.
On-site Review
The institution provided adequate information for both the off-site and the on-site review committees to determine that the Physical Plant unit is structured and staffed to operate and maintain physical facilities appropriately. While the standard does not require a Facility Master Plan, such a plan for the Gulf Coast Campus showed compliance with this standard for that site. The institution’s plan for a similar document for the Hattiesburg campus, which was scheduled to be updated in 2005, has been delayed due to late release of the 2005-06 budget by IHL, followed by a major hurricane weeks later. This plan, or a similar document that reviews and plans facilities, would have bolstered the institution’s argument that its facilities are adequate to serve the needs of the institution’s programs on the main campus, although the institution did provide an analysis of campus space using assignable square feet per student full-time equivalent, which also included comparisons to national standards. The institution provided information to the on-site review committee on the establishment of a Master Facility Planning Committee and an RFP for professional planning services.
Proposals are required to be submitted by April 7, 2006; the selected firm is to be announced by fall 2006; and a final plan is to be presented to the campus and the public in March 2007. The visits to the Gulf Coast sites raised some concerns on the part of the committee that the temporary facilities are currently very crowded and surely will not be adequate to meet the needs of the number of programs and the student body projected at this temporary site. The committee is sympathetic to the challenges that Southern Miss has faced and is aware that Southern Miss has made significant progress in restoring some normalcy for the students and faculty on the Gulf Coast. However, the committee believes strongly that a comprehensive plan should be developed very quickly to ensure that the facilities are adequate to fulfill the institution’s mission.
Recommendation 12: The committee recommends that the institution review and update the existing facility master plan for the Gulf Coast Campus and incorporate the effects of Hurricane Katrina into the plan.
Let's see. deCasal gets canned before the BOE reports on NCATE, and Exline gets a raise with this mottled SACS re-affirmation report (especially after receiving unprecedented technical and consulting support, cash resources, and the unqualified efforts of the entire faculty and staff)? Admittedly, both women inherited a mess, but there seems to be some asymmetry here.
If the big boss was paying the right attention to both processes from day one, we wouldn't be in this doo.
Why is it that Dr. Ken Malone is not on the following list?
Worksheet for Reporting Non-Compliance with Faculty Qualifications
- For Use with Visiting Committees -
Institution: University of Southern Mississippi
CS 3.7.1 Faculty Qualifications:
While the committee provided an evaluation that the institution is in compliance with Comprehensive
Standard 3.7.1, it still has some concerns about the documentation of the following four faculty members.
Name of Faculty Member / Department or Courses Taught / Reason for Questions
1. Pamela Jones, Foreign Language. It is not clear that Dr. Jones has relevant qualifications for teaching foreign language. The institution indicates that Dr. Jones is not teaching language courses this spring. It is not clear whether or not Dr. Jones will teach in subsequent terms.
2. Shelton Houston, Computing. M.S. in industrial and vocational education; Ph.D. in Education; dissertation on collaborative research and scholarship in electronics and technology. Courses seem to focus on technical aspects of computer technology, including networks, electronics assembly, and PC hardware. The relevance of graduate education in education-related fields for these technical courses is not clear.
3. Margaret Lockhead, UNIV 1001. Preparation for teaching at the college level is not clearly documented.
4. Diane Coleman, HPR 309. Certification in first aid. Qualifications for teaching at the college level are not clearly documented.
CS 3.4.13 Program Coordinators:
While the committee provided an evaluation that the institution is in compliance with Comprehensive Standard 3.4.13, it still has some concerns about the documentation for the following two program coordinators.
Donald Cabana has replaced Lisa Nored, J.D., as program coordinator for the MA/MS Administration of Justice: Juvenile degree program. Dr. Cabana has earned an M.S. degree in criminal justice and a Ph.D. in adult education. The relevance of the doctorate in adult education for administration of justice is not immediately apparent. Additional information indicates Dr. Cabana has graduate education that emphasizes the adult learner and methods and strategies of adult learning. In addition, Dr. Cabana has been warden of an adult prison and commissioner of corrections. The committee is unclear about the relevance of these experiences in adult education and justice for the role of coordinator of the MA/MS Administration of Justice: Juvenile program.
Todd Adams has replaced Shelton Houston as coordinator of the BS computer engineering technology program. The Focused Report identifies the replacement as “Dr.” Todd Adams. However, the documents provided indicate that Mr. Adams has an M.S. degree in engineering technology. Mr. Adams is described as having “long experience in industry and academia.” His vita indicates that he served as an electronics technician for Hooper’s Electronics and Empress Audio before spending approximately 10 years as a consultant. He has served in a faculty position at Southern Miss since 1997. It is not clear from the given documentation that these experiences provide the basis for service as a program coordinator nor that Mr. Adams has a doctoral degree.
Question wrote: Why is it that Dr. Ken Malone is not on the following list? Because economic development at the University of Southern Mississippi is defined as polymer science?
You must be joking, Obvious. SACS can see from USM's website that Malone isn't in the Polymer Science Dept. Considering the others on the list, Malone should have stood out like a sore thumb.
We're actually thinking alike here, LeftASAP, but I must not have expressed myself clearly. I was facetiously suggesting there is the department of polymer science and then there is the department of applied polymer science, dubbed economic development.
Actually, I don't understand why this one wasn't red-flagged by SACS.
Does "Carole" refer to the former Assist. Dean of the erstwhile CoEP? Colorado was where Janice Thompson was headed to serve as a public school superintendent until life took a very bad turn for her.
I know the Faculty Senate met with the SACS committee, so they were told about the problems. From what the committee wrote below, they must have been deaf or blind if they think USM has Shared Governance. It should be "Non-Compliance" until the document is implemented.
"3.7.5 The institution publishes policies on the responsibility and authority of faculty in academic and governance matters.
X Compliance
On-Site Committee Comments:
During the On-Site Committee visit, questions did arise from some faculty members regarding shared governance, although non-compliance with this standard was neither alleged nor demonstrated. While policies in the faculty handbook concerning the matter might be viewed as having broad latitude for interpretation, the committee found no evidence of actions inconsistent with the policy. The committee understands that a revised policy that provides more clarity and specificity vis-à-vis the intent of the institutional mission statement pertinent to the matter is on the verge of implementation."
The FS members were put in a difficult position. They were asked directly if the university was or was not in compliance. They thought it was best not to risk a bad review right before the search for a new president. The SACS people fully understood the situation and elected to look the other way. It was their call, and they blinked.
i agree and disagree with shrug. the faculty reps were put in an uncomfortable situation. but unless they were willing to provide documentation of violations, there is nothing the visiting team can do. and i disagree with shrug's conclusion that the team blinked. if the faculty reps were unwilling to claim a violation and would not document violations, then the visiting team can do nothing. the burden of proof is on those making the claim of violation.
left--since you've left, it's easy to second guess the actions of others. if you are ever in a similar position, i'd like to see what you'd do.
I'm not second guessing the Senate reps, Stinky. I now see it was a difficult decision and they had to make the call for what they thought was best for USM in the big picture. Thanks for the explanations.
stinky cheese man wrote: and don't second guess the visiting team either.
Well, the visiting team gave USM Non-Compliance on other items until the "paper work" gets done. They refer to a document which will clarify the Shared Governance policy when adopted. So I thought they could have put pressure on getting the policy into action by using "Non-Compliance." I agree these were hard decisions for all sides.
"The committee understands that a revised policy that provides more clarity and specificity vis-à-vis the intent of the institutional mission statement pertinent to the matter is on the verge of implementation."
Please note that the opening sentence of the comments says that the committee met with "some" faculty members. In fact, it met with the executive committee of the faculty senate -- a body elected by the members of the senate who are themselves elected to represent the faculty. If that isn't soft-pedaling the issue by misuse of language, then I don't know what is. In the earlier section on faculty responsibility for curriculum and instruction, the committee noted that the "university" presented adequate evidence that faculty is "primarily" involved.
So the committee neatly sets up the legitimate interests of "the university" (i.e., "the administration") vs. "a few faculty members" -- allowing it to delegitimize faculty concerns regarding governance.
I believe the faculty senate exec committee was deeply disappointed that the committee failed to acknowledge serious problems in the governance process. SCM, the exec committee presented "evidence" but was unwilling to go as far as to declare the university non-compliant, feeling as though that was the visiting committee's call -- and frankly, to put the faculty here in the role of determining compliance was irresponsible on the part of the visitors. Unless your head (speaking of the committee, not you personally) was buried deeply in the dirt, the public record of problems here is huge and accessible -- it is impossible to conceive that members of the committee were not aware of the issue.
Looking at the report, it is clear that a number of fast ones were pulled over the committee. I'm not satisfied, given the makeup of the committee, that they weren't willfully blind. For instance, praising the QEP for being "faculty generated" is a crock. The original QEP subject was technology, and it was put up by Joan and her crew -- over the concerns of many faculty (at that time I was on academic council and remember this discussion quite clearly). It was the SACS consultant who pointed out that this was not a good selection -- at which point our desperate VP in charge of accreditation then brought the faculty in. For the committee to pass over this poor beginning in the process suggests to me that it was either duped or was quite willing to allow itself to be duped.
In the meetings I attended, there was a great deal of concern on the part of the visiting team as to the amount of input the faculty at large had with regard to the general ed curriculum. They returned to this topic several times, with questions from several members. There was no mention of shared governance as a whole; apparently that was covered in other meetings.
i'm a bit astonished the visiting team would question the amount of faculty involvement in the general education area. we've had committees since the fleming administration. i sometimes feel the general education curriculum is an example of faculty involvement run amuck. now if they are questioning whether the faculty at large were involved, that's a good question. the faculty at large haven't, in general, been involved in the general education curriculum. but that's an outgrowth of the previous administration. right now, we're dealing with a mess in that area.
It's rather difficult to get disaccredited by SACS. Whatever the final outcome of the SACS assessment, the next USNews and World Report ratings will tell us what we really need to know - what our peers nationally think of us.
My example was intended to show that the committee was inclined to conflate the "university" with "administration", while failing to note that the group of faculty they met with concerning shared governance was an elected body representing the faculty, which is a significant portion of the university.
My experience with SACS, unlike my experience with my area accrediting body, reinforces my sense that there is a growing gap between those who administer and those who practice teaching and research. The SACS visitors, in my view, brought an administrative culture with them -- and they saw exactly what that culture tends to see.