Institutional Research & Effectiveness administers university-level assessments, including national surveys and standardized tests. The staff in Institutional Effectiveness are happy to help each academic program and academic support unit identify their target populations in the results of these instruments and provide guidance on how to incorporate these results into the annual IE report process.
The Office of Institutional Effectiveness responsibilities include tasks pertaining to both assessment and institutional effectiveness. OIE conducts and facilitates assessment testing campus-wide. Assessment results are shared with the faculty, appropriate administrators, and other relevant offices and are used to enhance student learning and strategic planning.
Students are asked to complete the exit surveys at the same time that they apply for graduation. Survey results are compiled by semester and then rolled up into one academic year. The following results provide information at the University level, and each academic program receives a report of its graduates during the year. Many of the academic programs use these results in their annual IE reports.
Undergraduate Exit Survey Summary
Graduate Exit Survey Summary
Formerly known as student evaluation of teaching, the course surveys are deployed through Scantron's Class Climate software at the end of every semester. In October 2021, Faculty Senate approved a new set of questions for a revised course survey. The overall results for each semester will be provided at this location going forward:
Fall Overall Results
Spring Overall Results
For the first time, in Spring 2022, faculty are being given the opportunity to add up to three questions to their student course surveys. The question bank is divided into four categories: Instructional Design, Instructional Delivery, Instructional Assessment, and Course Impact. To preview the question options, use the four drop-down menus below. Faculty may add questions to each course section they teach, and the additional questions can differ from course to course. Faculty will receive a link for adding the three optional questions via email from Class Climate (see the instructions on how to add the optional questions to your course surveys).
Optional Question Instructions.
The course objectives were well explained.
The course assignments were related to the course objectives.
The instructor related the course material to my previous learning experiences.
The instructor incorporated current material into the course.
The instructor made me aware of the current problems in the field.
The instructor gave useful writing assignments.
The instructor adapted the course to a reasonable level of comprehension.
The instructor exposed students to diverse approaches to problem solutions.
The instructor provided a sufficient variety of topics.
Prerequisites in addition to those stated in the catalog are necessary for understanding the material in this course.
I found the coverage of topics in the assigned readings too difficult.
The instructor gave clear explanations to clarify concepts.
The instructor's lectures broadened my knowledge of the area beyond the information presented in the readings.
The instructor demonstrated how the course material was worthwhile.
The instructor's use of examples helped to get points across in class.
The instructor's use of personal experiences helped to get points across in class.
The instructor encouraged independent thought.
The instructor encouraged discussion of a topic.
The instructor stimulated class discussion.
The instructor adequately helped me prepare for exams.
The instructor was concerned with whether or not the students learned the material.
The instructor developed a good rapport with me.
The instructor recognized individual differences in students' abilities.
A warm atmosphere was maintained in this class.
The instructor helped students to feel free to ask questions.
The instructor demonstrated sensitivity to students' needs.
The instructor told students when they had done particularly well.
The instructor used student questions as a source of discovering points of confusion.
The instructor's vocabulary made understanding of the material difficult.
The instructor demonstrated role model qualities that were of use to me.
The instructor's presentations were thought-provoking.
The instructor's classroom sessions stimulated my interest in the subject.
Within the time limitations, the instructor covered the course content in sufficient depth.
The instructor attempted to cover too much material.
The instructor moved to new topics before students understood the previous topic.
The instructor encouraged out-of-class consultations.
The instructor carefully explained difficult concepts, methods, and subject matter.
The instructor used a variety of teaching approaches to meet the needs of all students.
The instructor found ways to help students answer their own questions.
The instructor asked students to share ideas and experiences with others whose backgrounds and viewpoints differ from their own.
The exams were worded clearly.
Examinations were given often enough to give the instructor a comprehensive picture of my understanding of the course material.
The exams concentrated on the important aspects of the course.
I do not feel that my grades reflected how much I have learned.
The grades reflected an accurate assessment of my knowledge.
The instructor was readily available for consultation with students.
The criteria for good performance on the assignments or assessments were clearly communicated.
This course challenged me intellectually.
The course helped me become a more creative thinker.
I developed greater awareness of societal problems.
I felt free to ask for extra help from the instructor.
I learned to apply principles from this course to other situations.
This course challenged me to think critically and communicate clearly about the subject.
The community-engaged learning component of this course enhanced my understanding of course content.
Members of the Office of Institutional Research and Effectiveness administer this test to first-year students in the fall and senior students in the spring, representing all colleges at the university. This test evaluates four core skill areas: reading, writing, critical thinking, and mathematics, along with context-based areas of humanities, social sciences, and natural sciences. MSU’s results are also benchmarked against our Carnegie peers. Individual ETS reports share results for several academic programs that use these data in their annual IE reports. Furthermore, the university-level results are used as part of the University’s General Education program.
The ETS Proficiency Profile scores first-year and senior students on their proficiency in reading and critical thinking, writing, and mathematics across three levels of competency (Level 3 being the most difficult). For the purposes of MSU's assessment, all seniors, including those who have transferred in coursework, complete the exam; however, only those seniors who have not transferred coursework are used in the comparison with first-year students. Each year, MSU evaluates what percentage of its first-year and non-transfer senior students are proficient in reading, writing, and mathematics. These percentages can be compared to our Carnegie peers in the R1 and R2 classifications. The table below provides the average proficiency of MSU students compared to Carnegie peers from 2012-2017. As these data show, MSU's first-year students start at a level below our Carnegie peers; however, MSU seniors surpass our Carnegie peers in all areas except Critical Thinking and Level 3 Mathematics.
The NSSE measures students' engagement with coursework and studies and how the university motivates students to participate in activities that enhance student learning. The survey is used to identify practices that institutions can adopt or reform to improve the learning environment for students. Each year, the Office of Institutional Research and Effectiveness deploys the online survey to first-year students and seniors. The institution then compares its results to a group of peers from the same Carnegie classification, to a peer group determined by the NSSE examiners, and to a group of peers that MSU has identified. Individual reports are created for units that use the results in their annual IE reports. Certain questions are also utilized in several university-wide assessment activities, such as the Maroon & Write Quality Enhancement Plan and the General Education Assessment Plan.
Last Updated: April 07, 2022