Institutional Effectiveness & Certification
Director: Caroline Kobek Pezzarossi, Ph.D.
College Hall 410
(202) 559-5370 (videophone)
DISCLAIMER: The data in this section are fictitious and do not, in any way, represent any of the programs at Gallaudet University. This information is intended only as an example.
A rubric is a scoring guide used to assess performance against a set of criteria. At a minimum, it is a list of the components you are looking for when you evaluate an assignment. At its most advanced, it is a tool that divides an assignment into its component parts, and provides explicit expectations of acceptable and unacceptable levels of performance for each component.
1 - Checklists, the least complex form of scoring system, are simple lists indicating the presence, not the quality, of the elements. For that reason, checklists are not often used in higher education for program-level assessment, but faculty may find them useful for scoring and giving feedback on minor student assignments or on practice drafts of assignments.
Example 1: Critical Thinking Checklist
The student…
__ Accurately interprets evidence, statements, graphics, questions, etc.
__ Identifies the salient arguments (reasons and claims)
__ Analyzes and evaluates major alternative points of view
__ Draws warranted, judicious, non-fallacious conclusions
__ Justifies key results and procedures; explains assumptions and reasons
__ Fair-mindedly follows where evidence and reasons lead
Example 2: Presentation Checklist
The student…
__ engaged audience
__ used an academic or consultative ASL register
__ used adequate ASL syntactic and semantic features
__ cited references adequately in ASL
__ stayed within allotted time
__ managed PowerPoint presentation technology smoothly
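To make the presence-only nature of a checklist concrete, here is a minimal sketch in Python. The criteria are copied from the presentation checklist above, but the function and the observed ratings are hypothetical illustrations, not part of any Gallaudet instrument.

# Minimal illustrative sketch: a checklist records only whether each element
# is present, not how well it was done.
PRESENTATION_CRITERIA = [
    "engaged audience",
    "used an academic or consultative ASL register",
    "used adequate ASL syntactic and semantic features",
    "cited references adequately in ASL",
    "stayed within allotted time",
    "managed PowerPoint presentation technology smoothly",
]

def score_checklist(observed):
    """Mark each criterion present (True) or absent (False); no quality judgment."""
    return {criterion: criterion in observed for criterion in PRESENTATION_CRITERIA}

# Hypothetical rater observations for one student presentation.
feedback = score_checklist({
    "engaged audience",
    "cited references adequately in ASL",
    "stayed within allotted time",
})
for criterion, present in feedback.items():
    print("[x]" if present else "[ ]", criterion)

Because each element is simply checked or unchecked, the output gives quick feedback on what was missing but says nothing about how well the present elements were done.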
2 - Basic Rating Scales are checklists of criteria that evaluate the quality of elements and include a scoring system. The main drawback of rating scales is that the meaning of the numeric ratings can be vague. Without descriptors for the ratings, raters must judge based on their own sense of what the terms mean: for the same presentation, one rater might rate a student "good" while another might feel the same student was "marginal."
3 - Holistic Rating Scales use a short narrative of characteristics to award a single score based on an overall impression of a student's performance on a task. A drawback of holistic rating scales is that they do not identify specific areas of strength and weakness, and are therefore less useful for focusing your improvement efforts.
Use a holistic rating scale when the projects to be assessed will vary greatly (e.g., independent study projects submitted in a capstone course) or when the number of assignments to be assessed is significant (e.g., reviewing all the essays from applicants to determine who will need developmental courses).
The Holistic Critical Thinking Scoring Rubric: A Tool for Developing and Evaluating Critical Thinking. Retrieved April 12, 2010 from Insight Assessment.
4 - Analytic Rating Scales are rubrics that include explicit performance expectations for each possible rating, for each criterion. Analytic rating scales are especially appropriate for complex learning tasks with multiple criteria.
Evaluate carefully whether this is the most appropriate tool for your assessment needs. Analytic rating scales can provide more detailed feedback on student performance and more consistent scoring among raters, but the disadvantage is that they can be time-consuming to develop and apply.
Results can be aggregated to provide detailed information on strengths and weaknesses of a program.
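As a minimal sketch of that aggregation, the Python example below averages each criterion across a group of rated students. The criteria, ratings, and scale are fictitious, in keeping with the disclaimer above.

from statistics import mean

# Each student's analytic-rubric ratings on a 1-4 scale, keyed by criterion.
# (Fictitious data.)
ratings = [
    {"interpretation": 4, "analysis": 3, "conclusions": 2},
    {"interpretation": 3, "analysis": 3, "conclusions": 2},
    {"interpretation": 4, "analysis": 2, "conclusions": 1},
]

criteria = ratings[0].keys()
averages = {c: mean(r[c] for r in ratings) for c in criteria}

# Criteria with the lowest averages point to program-level weaknesses;
# those with the highest averages point to strengths.
for criterion, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{criterion}: average {avg:.2f} out of 4")

In this invented data set, "conclusions" would surface as the weakest criterion, suggesting where a program might focus its improvement efforts.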
Example: Critical Thinking Portion of the Gallaudet University Rubric for Assessing Written English
There are two ways to approach building an analytic rating scale: the logical model and the organic model. For both models, steps 1-3 are the same.
Tip: Adding numbers to the ratings can make scoring easier. However, if you plan to use the rating scale for course-level grading as well, a meaning must be attached to each score; for example, what is the minimum total score that would be considered acceptable for a "C"?
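One hypothetical way to attach such meaning is sketched in Python below. It assumes four criteria each rated 1-4 (totals of 4-16); the cut scores are invented for illustration and are not an institutional standard.

# Hypothetical sketch only: converting a rubric total to a course grade.
# Assumes four criteria rated 1-4 (totals of 4-16); the cut scores are
# invented for illustration.
def letter_grade(total):
    if total >= 15:
        return "A"
    if total >= 13:
        return "B"
    if total >= 10:   # minimum total considered acceptable for a "C"
        return "C"
    if total >= 8:
        return "D"
    return "F"

print(letter_grade(11))  # a total of 11 maps to a "C" under these cut scores

Whatever the cut scores, the point of the tip stands: decide in advance what each total means before using the same rubric for grading.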
Components of Analytic Rating Scales
Other possible descriptors include:
- examples of inconsistent performance characteristics and suggested corrections
Tips: Keep the list of characteristics manageable by including only critical evaluative components. Extremely long, overly detailed lists make a rating scale hard to use.
In addition to keeping descriptions brief, the language should be consistent. Below are several ideas for keeping descriptors consistent:
Keep the aspects of a performance the same across the levels, but add adjectives or adverbial phrases to show the qualitative difference.
A word of warning: numeric references on their own can be misleading. They are best paired with a qualitative reference (e.g., "three appropriate and relevant examples") to avoid rewarding quantity at the expense of quality. For example:
- uses several relevant strategies
- uses some relevant strategies
- uses few or no relevant strategies
Use rating scales for program-level assessment to see trends in strengths and weaknesses of groups of students.
Examples
For more information on using course-level assessment to provide feedback to students and to determine grades, see University of Hawaii’s “Part 7. Suggestions for Using Rubrics in Courses” and the section on Converting Rubric Scores to Grades in Craig A. Mertler’s “Designing Scoring Rubrics for Your Classroom”.
Adapted from sources below:
Allen, Mary. (January, 2006). Assessment Workshop Material. California State University, Bakersfield. Retrieved DATE from http://www.csub.edu/TLC/options/resources/handouts/AllenWorkshopHandoutJan06.pdf
http://www.uhm.hawaii.edu/assessment/howto/rubrics.htm
http://www.teachervision.fen.com/teaching-methods-and-management/rubrics/4523.html?detoured=1
http://www.assessmentforlearning.edu.au/professional_learning/success_criteria_and_rubrics/success_design_rubrics.html
Mertler, Craig A. (2001). Designing Scoring Rubrics for Your Classroom. Practical Assessment, Research & Evaluation. Retrieved April 7, 2010 from http://pareonline.net/getvn.asp?v=7&n=25
Mueller, Jon. (2001). Rubrics. Authentic Assessment Toolbox. Retrieved April 12, 2010 from http://jonathan.mueller.faculty.noctrl.edu/toolbox/rubrics.htm
http://en.wikipedia.org/wiki/Rubric_(academic)
Tierney, Robin & Marielle Simon. (2004). What's Still Wrong With Rubrics: Focusing on the Consistency of Performance Criteria Across Scale Levels. Practical Assessment, Research & Evaluation, 9(2). Retrieved April 13, 2010 from http://PAREonline.net/getvn.asp?v=9&n=2