Program Assessment

Program-Assessment Procedure

Spring Hill College assesses every academic program (majors, certificates, and some minors) annually, according to the following schedule.

An "academic year" is understood to run from 1 July to 30 June.

Each program designates an Assessment Liaison (or more than one), who is responsible for communicating with Institutional Review and Assessment and submitting these documents. However, assessment is a program-wide process, and it is the entire program's responsibility to ensure accurate and timely assessment.

Assessment Schedule

Starting with the 2024–2025 academic year, the following documents are due on 15 September:

For the current academic year (the year that began on the preceding 1 July): your program's Intended Outcomes and Assessment Plan (Steps 1–2 below).

For the preceding academic year: your program's Analysis and Continuous-Improvement Plan (Steps 3–4 below).

For example: on 15 September 2024, you would submit your Intended Outcomes and Assessment Plan for 2024–2025, along with your Analysis and Continuous-Improvement Plan for 2023–2024. And so on for each subsequent year.

You may also attach supporting files such as raw data, rubrics, or tables. You may label these files with "Supporting"; please see below.

Submission Guidelines

Until recently, we used an online Google Form to accept submissions. However, due to a change in IT policy, we are now asking you to upload your files to a folder designated for your division.

Please title your PDF file in the following format:

Examples:

Please note that, starting in Fall 2022, we keep assessment documents for face-to-face and online versions of programs separate. Therefore, if your program has both a face-to-face and an online version, please submit a separate report for each, as the submission instructions above explain.

You should already have access to your division's folder, but if not, please let tmetcalf@shc.edu know. Here are the links to the folders:

General Guidelines for Program Assessment

Spring Hill College follows a yearly assessment cycle.

If you want general information about the College's students and courses, on which you might base assessment plans and analyses, please see here.

For the Current Academic Year: Identify Outcomes and Assessment Plan

1. Identify Outcomes

Identify 2–5 intended outcomes for your academic program.

Base these on your program's mission and the content of its curriculum.

Ideally, outcomes are stated in a way that explicitly mentions by when the outcome will occur and how it will be demonstrated to have occurred. You may wish to state your outcomes in this form:

"By completion of the B.A. in Underwater Basket-Weaving, students will:

Notice that these outcomes specify a measurable outcome, how it will be measured, and by when it will be achieved. Notice that they are not limited to simply reporting knowledge that graduates will have, especially because knowledge is best measured by how it is demonstrated.

You may wish to consult Bloom's Taxonomy for ideas of actions that graduates will take in achieving these outcomes.

If your program's main contact with students is through courses that satisfy Core or program requirements, rather than through producing Bachelor's graduates, you may wish to state your outcomes in terms of what students who take courses in your program will achieve.

You may wish to produce a Curriculum Map for your program as well.
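
One common format for a curriculum map is a grid showing where each outcome is introduced, reinforced, and mastered across the curriculum. For instance, using the hypothetical program above (the course numbers and titles are purely illustrative):

                         Outcome 1     Outcome 2
   BSK 101 (Intro)       Introduced    Introduced
   BSK 301 (Methods)     Reinforced    Reinforced
   BSK 490 (Capstone)    Mastered      Mastered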

Notice that the outcomes' measurement is in terms of coursework. Ideally, you should consider other ways of measuring outcomes, including measures external to SHC courses. For example, you might be able to track students' placement rates in jobs or graduate school, or survey alumni. For more on measurement, proceed to the next step.

Ideally, at this point, cite your previous year's Continuous-Improvement Plan and explain how your unit has begun implementing it.

2. Assessment Plan

Explain in detail how your program will measure whether, and to what degree, its graduates or students achieve those outcomes. If you crafted your outcomes according to the advice above, then this should be straightforward. You can mention specific in-class examinations or projects. Some programs may cite final grades in key or capstone courses; while this can be helpful for internal assessment, such data are not sufficient for accreditation reports to SACSCOC.

As noted in the previous step's description, it's ideal to include a mixture of internal and external measures.

It is also ideal to include a mixture of formative and summative assessments. Formative assessments are applied during the learning process, typically within a course, so that students can use the feedback to improve their performance and learning during the course. Summative assessments are applied after the learning process is complete, to some instrument or product that the student has produced. For more information, you can consult various sources on the Internet.
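
For example (an illustrative case): comments on a draft essay returned mid-semester are formative, since students can use them to improve before submitting the final version; the score on that final version is summative.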

Another important strategy is to include both direct and indirect assessments.

Direct assessments come from demonstrations of students' knowledge or skills in samples of student work: scores on in-class or external (e.g., licensure or post-graduate) exams; written work such as essays, lab reports, term papers, discussion posts, and case-study analyses; and performances (e.g., artistic performances and products, exhibits, presentations) or capstone experiences (e.g., senior or honors theses, portfolios, or research projects) scored using a rubric.

Indirect assessments come not from direct demonstration but from evaluation of, or reflection on, students' skills or knowledge (or on direct demonstrations thereof). These might include overall course grades; student attributes (such as hours spent in class or class-participation rates); admission rates; placement rates; internship or field evaluations (e.g., from employers or from observations of fieldwork); student surveys, alumni surveys, and surveys of others' perceptions of students; exit interviews; self-reports of learning; and awards or scholarships earned. For more information, there are various useful sources on the Internet.

Ideally, include rubrics to help assessors ascertain the degree to which students meet the outcomes. Investing time now in rubrics will make actual assessment easier, especially when it comes time to write your Analysis—see next step.
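
For instance, one row of a rubric for the first hypothetical outcome above might look like this (the criterion, levels, and descriptors are purely illustrative):

   Criterion: watertightness of the capstone basket
   4 (Exemplary): holds water with no visible seepage
   3 (Proficient): holds water with only minor seepage
   2 (Developing): noticeable leakage, but retains most water
   1 (Beginning): does not retain water

A program might then set a target, such as "at least 80% of students will score 3 or higher on each criterion."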

As with your Intended Outcomes, ideally, explain how the assessment plan you will be using is compatible with implementation of your Continuous-Improvement Plan.

For the Preceding Academic Year: Analysis and Continuous-Improvement Plan

3. Analysis

At this step, your program's faculty and other stakeholders consider the results delivered by the measurements mentioned in the previous step. If you used rubrics, then it's relatively easy to get started with your analysis. Your students' performance according to the rubrics helps tell you the degree to which your program's students are achieving their outcomes—which, in turn, tells you the degree to which your program is achieving its mission.
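
For example, if 18 of 24 capstone students scored 3 or higher on the hypothetical watertightness criterion above, then 75% of students met that standard; comparing that figure to your target (say, 80%) tells you whether, and by how much, performance on the outcome fell short of or exceeded expectations.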

Explain whether your program followed its previous year's Continuous-Improvement Plan. If it didn't, explain why not.

Provide a discussion of whether and when your students achieved the outcomes, and to what degree. If they achieved an outcome with high-quality performance, say so, and briefly suggest an explanation or explanations for why they did so; this explanation may include following the previous year's Continuous-Improvement Plan. If they did not achieve some outcomes, provide some informed speculation about why not. As before, try to cite the degree to which your program followed the previous year's Continuous-Improvement Plan, and whether following it had any effect on the degree to which your program's students achieved their outcomes.

Whether or not your students achieved the outcomes, use this information to begin thinking about how to improve your program; you will detail these improvements in the next step: the Continuous-Improvement Plan.

Please note: at this step, you should also submit file(s) containing your raw data. Please submit a spreadsheet or workbook with the IDs of the students on whom you're basing your analysis in Column A, and the criteria on which you rated them, with the corresponding values, in the other columns. For example, if you base some of your assessment on standardized tests, include a spreadsheet with your students' IDs in Column A and the corresponding scores in the other columns. If you need assistance in obtaining a list of student IDs for your students, please contact your division chair or the Faculty Director of Institutional Research, Thomas Metcalf, at tmetcalf@shc.edu.

You may also view a general template and sample for such a spreadsheet here (one workbook with two sheets). You can download it as an Excel file and then fill out your version of it for your program(s). Please note that you may need to change the column titles, or add or subtract columns, to match your assessment methods and ensure that you include a column for each part of each assessment instrument.
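
For illustration, a raw-data sheet for the hypothetical rubric and exam above might look like this (the student IDs and scores are invented):

   Student ID   Watertightness (rubric)   Technique (rubric)   Exam score (%)
   900123       3                         4                    88
   900124       2                         3                    74
   900125       4                         4                    91

Each column after Column A corresponds to one part of one assessment instrument; add or remove columns to match your program's measures.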

4. Continuous-Improvement Plan

Now that your program has had time to develop, inspect, and digest its Analysis, your program's faculty and other stakeholders can collaborate to compose a Continuous-Improvement Plan.

If your program is generally achieving its mission (and its students are generally achieving the intended outcomes), include in your plan how you will continue to implement your successful strategies. Even if your students are performing well, think carefully about your program's mission and decide how you can further improve the degree to which the mission is accomplished. You may at least wish to identify a new outcome or modify an existing one, or develop a new rubric or curriculum map.

If your program is generally not achieving its mission, or your students are generally not achieving the intended outcomes, include in your plan what you will change in order to improve their performance. If the problems are chiefly external (i.e., outside your program's control), explain how you will nevertheless mitigate them, or otherwise better prepare your students to achieve your program's intended outcomes.

This is also a natural point at which to consider changing your intended outcomes or your assessment plan. You can document and explain any such changes here, and you will implement them in the next round of intended outcomes (see Step 1 above).


For More Information

Please consult with your Faculty Director of Institutional Research, Thomas Metcalf, at tmetcalf@shc.edu.

Official Listing of Credential Programs

Please consult the Bulletin's "Summary Listing of Academic Programs." The Bulletin may be found in the Registrar section of Badgerweb.

Program-Assessment Progress

To view your division's progress and completed submissions, please follow the corresponding link to your division's folder (see above) and inspect the contents of its subfolders.