RUBRICS:
THE VERSATILE AND
PRACTICAL CHOICE
Assessment Day 4.0
February 2016
Eric Streeter, Senior Academic Advisor
Outline
• What rubrics are
• Why we chose to use rubrics
• Creating the rubrics
• Experience using rubrics in academic advising
• Data
WHAT ARE RUBRICS?
Rubrics
• Used to score or rate
• Articulate what is being measured and the scale it is measured on
• Descriptive rubrics: the gold standard
• The scale is further explained for each item being measured
A Descriptive Rubric Example
Criterion: Clarity (thesis supported by relevant information and ideas)
• Needs Improvement (1): The purpose of the student work is not well-defined. Central ideas are not focused to support the thesis. Thoughts appear disconnected.
• Developing (2): The central purpose of the student work is identified. Ideas are generally focused in a way that supports the thesis.
• Sufficient (3): The central purpose of the student work is clear, and ideas are almost always focused in a way that supports the thesis. Relevant details illustrate the author's ideas.
• Above Average (4): The central purpose of the student work is clear, and supporting ideas are always well-focused. Details are relevant and enrich the work.
Your Experience with Rubrics
• Academic assignments
• Judged contests (talent show, costume contest, etc.)
• Performance evaluations
• Candidate evaluations
• To an extent – some survey questions resemble rubrics
WHY USE RUBRICS?
The Choice: Rooted in Mapping
• Direct measure for learning outcomes
• Context for measuring learning: mostly advising appointments
• Ease of use
• Versatility
• Conversations in advisement appointments
• “Assignments” completed by advisees
• Timeframes for use
Rubric Versatility
Advising Structure:
• College of Arts and Sciences
• School of Architecture and
Planning
• School of Engineering and
Applied Sciences
• School of Management
• School of Medicine and
Biomedical Sciences
• School of Nursing
• School of Pharmacy and
Pharmaceutical Sciences
• School of Public Health and
Health Professions
• Student Advising Services
• Athletics
• Access to College Excellence
Program (ACE)
• Daniel Acker Scholars
• Educational Opportunity
Program (EOP)
• Student Support Services
(SSS)
• University Honors College
CREATING OUR RUBRICS
Rubric Tool
Baseline, from Campus Labs
• Was already available
to us
• Easy to access
• Easy to set up
Development & Testing
• Developing the criteria and descriptions
• Decision to use a 3-point scale
• Testing
• Several advisors from different departments
• Multiple contexts
• Increased support for the tool
ADVISING ASSESSMENT
RUBRICS
Major and General Education Requirements Rubric
Goals Rubric
Decisions and Tradeoffs
Choice
• Flexibility for sampling
method
• Timeline
• Sample size
Tradeoffs
• Challenges to the viability of the data
Decisions and Tradeoffs
Choice
• Baseline tool
• Availability
• Ease of use
• Ease of set up
Tradeoffs
• Lack of flexibility to make
certain customizations
• Comment sections
• Criteria descriptions
• Built-in reports do not say much
• Advisors cannot view the
data they entered
Decisions and Tradeoffs
Choice
• Entering student person
number
• Demographic data
• Makes using the tool faster, since other demographic data does not have to be entered
Tradeoffs
• Some data gets lost when ID numbers are entered incorrectly
Decisions and Tradeoffs
Choice
• 3-point scale
• Easy to use
• Required less instruction
Tradeoffs
• Less dynamic data
Challenges
• Training
• People do not understand what a rubric is at first
• When to do the assessment
• Limitations of the Baseline tool
• Challenges to the data based on the tradeoffs
Successes
• Positive feedback about ease of use
• Every unit agreed to use at least one rubric
• Data!
DATA
LO: Students will know the requirements for their major
• Requirements Rubric
LO: Students will know their requirements for general education
• Requirements Rubric
LO: Students will be able to select appropriate courses based on their goals
• Goals Rubric
(Results charts for each learning outcome are not reproduced here.)
USING THE DATA
“Closing the Loop”
• Share data
• Engage stakeholders
• Make interpretations
• Set targets for improvement
• Brainstorm ideas for improvement
• Implement changes for improvement
• Continue to assess after changes are implemented


Editor's Notes

  • #14 https://www.studentvoice.com/p/Project.aspx?q=006244b9a34bc14cb9d4a99e407278a32649fe42877e8d7d1fdadd7f105ac5081e01f3a389172d057c7e9e741338f9b7ce6d792b66655fbe15d85a0661098997&r=8c33a0bd-01b9-4c76-b473-f92dd1690246 https://www.studentvoice.com/p/Project.aspx?q=d94e860325b81365419f0f83f645d29ba4e5c4821e866128b558afe54d0bab0ba3a7c8d59366f6a7eb95b3a09ffd2d9cb2894c1c853d6f911aa5258d9c65499d&r=b0e5f047-2b1b-4ead-bb49-9b6437c6a631