DISCLAIMER: The data in this section are fictitious and do not, in any way, represent any of the programs at Gallaudet University. This information is intended only as an example.

Using Assessment Results

This step involves using your analysis of the data to make recommendations for program changes that will improve student learning.


Recommendations: actions taken, or to be taken, to improve student learning that are clearly supported by the data: what will be done, who will do it, how it will be assessed, and by when.

*Once the data are analyzed, the unit should be able to see whether it has achieved its intended outcome.

  • Where the criterion is met or surpassed, the unit may rightly conclude that no change is needed and report, “No action required.” If the results are repeated when the same outcome is assessed the next year and the staff can ensure the criterion was met, the unit should consider assessing a different outcome in the following cycle.
  • In the case where the results indicate the criterion level was not met, the unit needs to evaluate its results further to determine what needs to be done to improve the likelihood of achieving the outcome.
  • The unit makes an action plan that includes what will be done, who will do it, and by when. This step is what makes the difference between “assessment” and “busy work.”

**AREAS to look at when assessment results are disappointing…

Goals
  • Are your goals inappropriate or overly ambitious?
  • Do they need to be clarified?
  • Do you have too many?

Curriculum
  • Does the curriculum adequately address each SLO?

Teaching methods
  • Are you teaching in the way students learn best?

Assessment tools
  • Are they poorly written or easily misinterpreted?
  • Do they match your SLOs?
  • Are they too difficult for most responsible students?

Students
  • Is poor performance really the student’s fault?


Develop recommendations to improve student learning outcomes based on your data analysis, which identified the strengths and weaknesses of your program/unit. You should not only create a plan to improve on your weaknesses but also build on your strengths to make them better. (Remember to build into the plan the periodic re-assessing of your strengths to make sure you’re not slipping.)


***Results of the pre-test have documented conclusively that students entering the class are far from "knowing it all"; in fact, the scores are typically below 50% accurate. These pre-test data document the great need for the Library 101 course, despite the claims of some students, and form the foundation for subsequent student learning throughout a student’s academic career. Even though the final exam shows a dramatic increase in student learning, several items still require improvement. Item 2: an achievement rate of just 32.8% is not adequate, so course administrators will investigate why more students are not learning or retaining this specific item. Item 3: there is a great positive jump in student learning on the final exam, but a success rate of only 56.1% is still not adequate; this item will also be addressed by course administrators in an effort to increase the overall percentage of student learning and retention on this item.
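As a rough sketch of how such item-level results might be screened, the snippet below compares pre-test and final-exam success rates and flags any item still below a criterion level. The 70% criterion and the pre-test figures are assumptions for illustration only; the 32.8% and 56.1% final-exam rates come from the (fictitious) example above.

```python
# Sketch: flag exam items whose final-exam success rate is still below
# a criterion. All figures here are fictitious; the 70% criterion and
# the pre-test rates are assumptions made for this illustration.

CRITERION = 0.70  # assumed acceptable success rate

# (item name, pre-test % correct, final-exam % correct)
items = [
    ("Item 2", 0.20, 0.328),  # final-exam rate from the example above
    ("Item 3", 0.25, 0.561),  # final-exam rate from the example above
]

def flag_items(results, criterion=CRITERION):
    """Return (name, gain, final) for items below the criterion."""
    return [(name, final - pre, final)
            for name, pre, final in results
            if final < criterion]

for name, gain, final in flag_items(items):
    print(f"{name}: final {final:.1%} (gain {gain:+.1%}) is below criterion")
```

A list like this gives the unit a concrete starting point for its action plan: each flagged item names what needs attention, and the gain column shows whether instruction is moving students in the right direction even where the criterion is not yet met.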

*Assessing the Effectiveness of Non-Instructional Support Offices

**Adapted from: Suskie, L. (2008, December). Understanding and Using Assessment Results. Paper presented at the 2008 Middle States Commission on Higher Education Annual Conference.

***Example adapted from:  Iowa State University. Library 160: Measurement of Outcomes and Results.
