Instructions: Developing Scoring Criteria (Rubrics)

DISCLAIMER: The data in this section are fictitious and do not, in any way, represent any of the programs at Gallaudet University. This information is intended only as an example.


Types of Scoring Criteria (Rubrics)

A rubric is a scoring guide used to assess performance against a set of criteria. At a minimum, it is a list of the components you are looking for when you evaluate an assignment. At its most advanced, it is a tool that divides an assignment into its component parts, and provides explicit expectations of acceptable and unacceptable levels of performance for each component.

Types of Rubrics

1 - Checklists, the least complex form of scoring system, are simple lists indicating the presence, NOT the quality, of the elements. Therefore, checklists are NOT frequently used in higher education for program-level assessment. But faculty may find them useful for scoring and giving feedback on minor student assignments or practice/drafts of assignments.

Example 1: Critical Thinking Checklist
The student…
__ Accurately interprets evidence, statements, graphics, questions, etc.
__ Identifies the salient arguments (reasons and claims)
__ Analyzes and evaluates major alternative points of view
__ Draws warranted, judicious, non-fallacious conclusions
__ Justifies key results and procedures, explains assumptions and reasons
__ Fair-mindedly follows where evidence and reasons lead

Example 2: Presentation Checklist
The student…
__ engaged audience
__ used an academic or consultative ASL register
__ used adequate ASL syntactic and semantic features
__ cited references adequately in ASL
__ stayed within allotted time
__ managed PowerPoint presentation technology smoothly


2 - Basic Rating Scales are checklists of criteria that evaluate the quality of elements and include a scoring system. The main drawback of rating scales is that the meaning of the numeric ratings can be vague. Without descriptors for the ratings, raters must make judgments based on their own perceptions of the terms. For the same presentation, one rater might rate a student "good" while another feels the same student was "marginal."

Example: Basic Rating Scale for Critical Thinking

Rating: 5 = Excellent, 4 = Good, 3 = Fair, 2 = Marginal, 1 = Inadequate

Rate the student on each criterion from 1 to 5:

__ Accurately interprets evidence, statements, graphics, questions, etc.
__ Identifies the salient arguments (reasons and claims)
__ Analyzes and evaluates major alternative points of view
__ Draws warranted, judicious, non-fallacious conclusions
__ Justifies key results and procedures, explains assumptions and reasons
__ Fair-mindedly follows where evidence and reasons lead

3 - Holistic Rating Scales use a short narrative of characteristics to award a single score based on an overall impression of a student's performance on a task. A drawback of holistic rating scales is that they do not identify specific areas of strength and weakness, and therefore are less useful for focusing your improvement efforts.

Use a holistic rating scale when the projects to be assessed will vary greatly (e.g., independent study projects submitted in a capstone course) or when the number of assignments to be assessed is significant (e.g., reviewing all the essays from applicants to determine who will need developmental courses).

Example: Holistic Rating Scale for Critical Thinking Scoring

  • Facione, Peter A., Facione, Noreen C., and Measured Reasons LLC. (2009). The Holistic Critical Thinking Scoring Rubric: A Tool for Developing and Evaluating Critical Thinking. Retrieved April 12, 2010, from Holistic Critical Thinking Scoring Rubric

4 - Analytic Rating Scales are rubrics that include explicit performance expectations for each possible rating, for each criterion. Analytic rating scales are especially appropriate for complex learning tasks with multiple criteria.

Evaluate carefully whether this is the most appropriate tool for your assessment needs. Analytic rating scales can provide more detailed feedback on student performance and more consistent scoring among raters; the disadvantage is that they can be time-consuming to develop and apply.

Results can be aggregated to provide detailed information on the strengths and weaknesses of a program.


Example: Critical Thinking Portion of the Gallaudet University Rubric for Assessing Written English

Rating: 1 = Pre-College Skills, 2 = Emerging Skills, 3 = Developing Skills, 4 = Mastering Skills, 5 = Exemplary Skills

IDEAS and CRITICAL THINKING (each row describes one criterion at the five levels)

Central point:
1. Assignment lacks a central point.
2. Displays central point, although not clearly developed.
3. Displays adequately-developed central point.
4. Displays clear, well-developed central point.
5. Central point is uniquely displayed and developed.

Development of ideas:
1. Displays no real development of ideas.
2. Develops ideas superficially or inconsistently.
3. Develops ideas with some consistency and depth.
4. Displays insight and thorough development of ideas.
5. Ideas are uniquely developed.

Support for ideas:
1. Lacks convincing support for ideas.
2. Provides weak support for main ideas.
3. Develops adequate support for main ideas.
4. Develops consistently strong support for main ideas.
5. Support for main ideas is uniquely accomplished.

Critical manipulation of ideas:
1. Includes no analysis, synthesis, interpretation, and/or other critical manipulation of ideas.
2. Includes little analysis, synthesis, interpretation, and/or other critical manipulation of ideas.
3. Includes analysis, synthesis, interpretation, and/or other critical manipulation of ideas in most parts of the assignment.
4. Includes analysis, synthesis, interpretation, and/or other critical manipulation of ideas throughout.
5. Includes analysis, synthesis, interpretation, and/or other critical manipulation of ideas throughout, leading to an overall sense that the piece could withstand critical analysis by experts in the discipline.

Integration of ideas:
1. Demonstrates no real integration of ideas (the author's or the ideas of others) to make meaning.
2. Begins to integrate ideas (the author's or the ideas of others) to make meaning.
3. Displays some skill at integrating ideas (the author's or the ideas of others) to make meaning.
4. Is adept at integrating ideas (the author's or the ideas of others) to make meaning.
5. Integration of ideas (the author's or the ideas of others) is accomplished in novel ways.


    ____________________________________________________

Steps for Creating an Analytic Rating Scale (Rubric) from Scratch

There are two ways to approach building an analytic rating scale: the logical method or the organic method. For both methods, steps 1-3 are the same.

Steps 1 – 3: Logical AND Organic Methods

Determine the Best Tool

1. Identify what is being assessed (e.g., the ability to apply theory), keeping the focus on program-level learning assessment. Then determine whether an analytic rating scale is the most appropriate way to score the performance and/or product. An analytic rating scale is probably a good choice
   a. if there are multiple aspects of the product or process to be considered, and
   b. if a basic rating scale or holistic rating scale cannot provide the breadth of assessment you need.

Building the Shell

The Rows

2. Specify the skills, knowledge, and/or behaviors that you will be looking for (e.g., the ability to apply theory).
   • Limit the characteristics to those that are most important to the assessment.


The Columns

3. Develop a rating scale with levels of mastery that are meaningful.

   Tip: Adding numbers to the ratings can make scoring easier. However, if you also plan to use the rating scale for course-level grading, a meaning must be attached to each score. For example, what is the minimum score that would be considered acceptable for a "C"?

    Other possible descriptors include:
    * Exemplary, Proficient, Marginal, Unacceptable
    * Advanced, High, Intermediate, Novice
    * Beginning, Developing, Accomplished, Exemplary
    * Outstanding, Good, Satisfactory, Unsatisfactory
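If the ratings will also feed course grades, the tip above can be made concrete by fixing explicit score-to-grade cutoffs before scoring begins. A minimal Python sketch; the five-criterion assumption and the cutoff values are hypothetical, not institutional policy:

```python
# Hypothetical mapping from total rubric score to a letter grade.
# Assumes a five-criterion rubric, each criterion rated 1-5 (totals 5-25).

CUTOFFS = [      # (minimum total score, grade), highest first
    (23, "A"),
    (20, "B"),
    (17, "C"),   # e.g., 17 of 25 is the minimum acceptable for a "C"
    (14, "D"),
]

def grade_from_rubric(scores):
    """Convert a list of per-criterion ratings (1-5) into a letter grade."""
    total = sum(scores)
    for minimum, grade in CUTOFFS:
        if total >= minimum:
            return grade
    return "F"

# A student rated 5, 4, 4, 3, 4 across the five criteria totals 20:
print(grade_from_rubric([5, 4, 4, 3, 4]))  # -> B
```

Agreeing on such cutoffs in advance keeps course-level grading consistent with the program-level meaning of each rating.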

     

Step 4: Writing the Performance Descriptors in the Cells

The descriptors are the critical piece of an analytic rating scale. To produce useful, valid scores, the attributes in your descriptors must be consistent across the ratings and easy to read. See examples of inconsistent performance characteristics and suggested corrections.

4. Use either the logical or the organic method to write the descriptions for each criterion at each level of mastery.

Logical Method: For each criterion, at each rating level, brainstorm a list of the performance characteristics*. Each characteristic should be mutually exclusive of the others.

Organic Method: Have experts sort sample assignments into piles labeled by rating (e.g., Outstanding, Good, Satisfactory, Unsatisfactory). Based on the documents in the piles, determine the performance characteristics* that distinguish the assignments.

Tip: Keep the list of characteristics manageable by including only critical evaluative components. An extremely long, overly detailed list makes a rating scale hard to use.

In addition to keeping descriptions brief, keep the language consistent. Below are several ways to keep descriptors consistent:

• Refer to specific aspects of the performance at each level:

  3: analyzes the effect of …
  2: describes the effects of …
  1: lists the effects of …

• Keep the aspects of the performance the same across the levels, adding adjectives or adverbial phrases to show the qualitative difference:

  3: provides a complex explanation; shows a comprehensive knowledge
  2: provides a detailed explanation; shows a sound knowledge
  1: provides a limited explanation; shows a basic knowledge

• Refer to the degree of assistance needed by the student to complete the task:

  3: uses correctly and independently
  2: uses with occasional peer or teacher assistance
  1: uses only with teacher guidance

• Use numeric references to show quantitative differences among levels. A word of warning: numeric references on their own can be misleading. They are best teamed with a qualitative reference (e.g., three appropriate and relevant examples) to avoid rewarding quantity at the expense of quality.

  3: provides three appropriate examples; uses several relevant strategies
  2: provides two appropriate examples; uses some relevant strategies
  1: provides an appropriate example; uses few or no relevant strategies

Steps 5-6: Logical AND Organic Methods

5. Test the rating scale before making it official. Hold a norming* session.
   Ask colleagues who were not involved in the rating scale's development to apply it to some products or behaviors, and revise as needed to eliminate ambiguities, confusion, and/or inconsistencies. You might also let students self-assess using the rating scale.

   *See University of Hawaii's "Part 6. Scoring Rubric Group Orientation and Calibration" for directions for this process.

6. Review and revise.
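A norming session can be followed by a quick consistency check, such as the proportion of assignments on which two raters gave exactly the same rating. A small Python sketch; the rater names and ratings are invented for illustration:

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of assignments on which two raters gave the same rating."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same assignments")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical ratings (1-5) from two raters on eight sample assignments:
rater_1 = [4, 3, 5, 2, 4, 3, 3, 5]
rater_2 = [4, 3, 4, 2, 4, 2, 3, 5]
print(percent_agreement(rater_1, rater_2))  # 6 of 8 match -> 0.75
```

Low agreement on a particular assignment is a signal that a descriptor is ambiguous and needs revision.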


Steps for Adapting an Existing Analytic Rating Scale (Rubric)

1. Evaluate the rating scale. Ask yourself:
   • Does the rating scale relate to all or most of the outcome(s) I need to assess?
   • Does it address anything extraneous?
2. Adjust the rating scale to suit your specific needs.
   • Add missing criteria.
   • Delete extraneous criteria.
   • Adapt the rating scale.
   • Edit the performance descriptors.
3. Test the rating scale.
4. Review and revise again, if necessary.


    Uses of Rating Scales (Rubrics)

    Use rating scales for program-level assessment to see trends in strengths and weaknesses of groups of students.

Examples
• To evaluate a holistic project (e.g., a thesis, exhibition, or research project) in a capstone course that pulls together all that students have learned in the program.
• Supervisors might use a rating scale developed by the program to evaluate students' field experience and provide feedback to both the student and the program.
• Aggregate the scores of a rating scale used to evaluate a course-level assignment. For example, the Biology department develops a rating scale to evaluate students' reports from 300- and 400-level sections. The professors use the scores to help determine students' grades and to give students feedback for improvement. The scores are also given to the department's Assessment Coordinator, who summarizes them to determine how well the program is meeting its student learning outcome, "Make appropriate inferences and deductions from biological information."
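The aggregation step in the Biology example might look like the following Python sketch: collect each report's per-criterion ratings, then average by criterion to surface program-level strengths and weaknesses. The criterion names and scores here are invented for illustration:

```python
# Hypothetical aggregation of rubric scores for program-level assessment.
CRITERIA = ["central point", "development", "support", "analysis", "integration"]

student_scores = [    # one list of per-criterion ratings (1-5) per report
    [4, 3, 4, 2, 3],
    [5, 4, 4, 3, 4],
    [3, 3, 2, 2, 3],
    [4, 4, 3, 3, 3],
]

def criterion_means(scores):
    """Average each criterion across all student reports."""
    n = len(scores)
    return {name: sum(report[i] for report in scores) / n
            for i, name in enumerate(CRITERIA)}

means = criterion_means(student_scores)
weakest = min(means, key=means.get)  # lowest mean flags the area to improve
print(weakest)  # -> analysis
```

Reporting the per-criterion means, rather than only a total, is what makes the rubric useful for identifying where the program should focus its improvement efforts.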

For more information on using course-level assessment to provide feedback to students and to determine grades, see University of Hawaii's "Part 7. Suggestions for Using Rubrics in Courses" and the section on Converting Rubric Scores to Grades in Craig A. Mertler's "Designing Scoring Rubrics for Your Classroom."


    Sample Rating Scales (Rubrics)


    Resources

    Adapted from sources below:

    Allen, Mary. (January, 2006). Assessment Workshop Material. California State University, Bakersfield. Retrieved DATE from http://www.csub.edu/TLC/options/resources/handouts/AllenWorkshopHandoutJan06.pdf  

    Creating and Using Rubrics. (March, 2008). University of Hawai’i at Manoa. Retrieved April 5, 2010  from http://www.uhm.hawaii.edu/assessment/howto/rubrics.htm

    Creating an Original Rubric. Teaching Methods and Management, TeacherVision. Retrieved April 7, 2010 from http://www.teachervision.fen.com/teaching-methods-and-management/rubrics/4523.html?detoured=1

    Danielson, Cherry and Naser, Curtis. (November 7, 2009). Developing Effective Rubrics: A New Tool in Your Assessment Toolbox. Workshop at Annual NEAIR Conference.

    How to Design Rubrics. Assessment for Learning Curriculum Corporation. Retrieved April 7, 2010 from http://www.assessmentforlearning.edu.au/professional_learning/success_criteria_and_rubrics/success_design_rubrics.html

    Mertler, Craig A. (2001). Designing Scoring Rubrics for Your Classroom. Practical Assessment, Research & Evaluation. Retrieved April 7, 2010 from http://pareonline.net/getvn.asp?v=7&n=25

    Mueller, Jon. (2001). Rubrics. Authentic Assessment Toolbox. Retrieved April 12, 2010 from http://jonathan.mueller.faculty.noctrl.edu/toolbox/rubrics.htm  

Rubric (academic). (2010, March 3). In Wikipedia, the free encyclopedia. Retrieved April 7, 2010, from http://en.wikipedia.org/wiki/Rubric_(academic)

    Tierney, Robin & Marielle Simon. (2004). What's Still Wrong With Rubrics: Focusing on the Consistency of Performance Criteria Across Scale Levels. Practical Assessment, Research & Evaluation, 9(2). Retrieved April 13, 2010 from http://PAREonline.net/getvn.asp?v=9&n=2  



     
