ASSESSMENT OF LEARNING OUTCOMES
(Assessment 1)
SECOND EDITION
Copyright, 2012 by
ROSITA L. NAVARRO, Ph.D.
ROSITA G. SANTOS, Ph.D.
and
LORIMAR PUBLISHING, INC.
ISBN 971-685-748-1
Published by
LORIMAR PUBLISHING, INC.
776 Aurora Blvd., cor. Boston Street, Cubao, Quezon City, Metro Manila
Tel. Nos. 721-2715 • 723-1560 • 727-3386  Cellphone No. 0918-5375190
Telefax (632) 727-3386
Cover and book design by Ronnie Llena Martinez
All rights reserved. No part of this book may be reproduced or transmitted in
any form or by any means, electronic or mechanical, including photocopying,
recording, mimeographing, or by any information and retrieval system, without
written permission from the copyright holder.
Printed in the Philippines by
ADRIANA PRINTING CO., INC
776 Aurora Boulevard, cor. Boston St., Cubao, Quezon City
Tel. No. 722-5175

To purchase additional copies of this Worktext, call LORIMAR PUBLISHING, INC.
Tel. Nos. 721-2715 • 723-1560 • 727-3386 • CP No. 0918-8375190
Telefax (632) 727-3386. Ask for Julie or send e-mail to
[email protected]
Foreword
Setting new benchmarks for teacher education is no mean
task. This is so in the instance of Project WRITE (Writing Resources
for Innovative Teacher Education) which we initiated on
August 4, 2006, culminating in the production and publication of
workbooks and textbooks.

It was a yeoman's task to gather our colleagues from Luzon,
Visayas, and Mindanao and motivate them to get involved in
a textbook writing process. The best happened such that we
were able to form a prolific partnership, as proven by this WRITE
publication.
This second edition of Assessment of Learning Outcomes
(Assessment 1) is aligned with the National Competency-Based
Teacher Standards (NCBTS), the standards of good teaching in the
Philippines consisting of seven (7) integrated domains, twenty-one
(21) strands and eighty (80) performance indicators, with the
end in view of contributing to the formation of competent and
committed professional teachers. It is focused on the development
of the 21st century teacher who can respond to the demands of
the K to 12 Basic Education Curriculum which the Department
of Education began implementing this SY 2012-2013. It is dotted
with activities and exercises which are in keeping with current
trends in education such as outcomes-based, reflective, integrative,
interactive, inquiry-based, brain-based and research-based teaching,
constructivism, multiple intelligences, multicultural education, and
authentic assessment. Likewise, this edition updates the readers
on educational legislation, including the universalization of
Kindergarten.
We gratefully acknowledge the authors who are pillars
of teacher education in the country - Dr. Rosita L. Navarro
and Dr. Rosita G. Santos, for sharing their expertise in teacher
education.
This Project WRITE publication is one of the latest editions
of the more than 30 publications that are in active circulation
throughout the country. We look forward to more publications
that will help improve teacher performance and advance the
enhancement of both the pre-service and in-service phases of
teacher education.
Onward to Project WRITE!

PREFACE
The shift of educational focus from subject/course content
to student learning outcomes marks the serious effort to implement
Outcomes-Based Education (OBE), which is the current national and
international thrust of education at all levels. Without the appropriate
assessment measures, however, the aims and ideals of OBE may not
be strategically achieved.
This book implements and complements OBE by clarifying the
concept of Outcomes-Based Education (OBE), identifying and
explaining student learning outcomes at the different levels of
schooling, and illustrating a variety of assessment tools to determine
if the desired learning outcomes have been achieved.

Every chapter of the book includes models and examples to
aid students' understanding. To enhance their capability to apply the
concepts learned, relevant exercises are provided at the end of
each chapter.
The authors effectively blended measurement and evaluation
in the discussion of OBE and learning outcomes-based assessment in
order to complete the teaching-learning cycle, thus making the book
an invaluable guide not only for teacher education students but also
for practicing professional teachers.
BRENDA CORPUZ, Ph.D.
President
Philippine Association for Teacher Education (PAFTE)

Table of Contents
FOREWORD
PREFACE
Chapter 1. Shift of Educational Focus from Content to Learning Outcomes
    1.1. Outcomes-Based Education: Matching Intentions with Accomplishments
    1.2. The Outcomes of Education
    1.3. Sample Educational Objectives and Learning Outcomes in Araling Panlipunan (K to 12)
    1.4. Exercises

Chapter 2. Measurement, Assessment and Evaluation in Outcomes-Based Education
    2.1. Measurement
    2.2. Assessment
    2.3. Evaluation
    2.4. Exercises

Chapter 3. The Outcomes of Student Learning
    3.1. Program Objectives and Student Learning Outcomes
    3.2. The Three Types of Learning
    3.3. Domain I: Cognitive (Knowledge)
    3.4. Domain II: Psychomotor (Skills)
    3.5. Domain III: Affective (Attitude)
    3.6. Exercises

Chapter 4. Assessing Student Learning Outcomes
    4.1. Principles of Good Practice in Assessing Learning Outcomes
    4.2. Samples of Supporting Student Activities
    4.3. The Outcomes Assessment Phases in the Instructional Cycle
    4.4. Variety of Assessment Instruments
         • Objective Examinations
         • Essay Examinations
         • Written Work
         • Portfolio Assessment
         • Assessment Rubrics
           Holistic Rubric
           Dimensional/Analytical Rubric
         • Competencies/Skills Assessment from Beginner to Proficiency Level
    4.5. Assessment of Learning Outcomes in the K to 12 Program
    4.6. Exercises

Chapter 5. Development of Varied Assessment Tools: Knowledge and Reasoning
    5.1. Types of Objective Tests
    5.2. Planning a Test and Constructing a Table of Specifications (TOS)
    5.3. Constructing a True-False Test
    5.4. Multiple Choice Tests
    5.5. Matching Type and Supply Type Items
    5.6. Essays
    5.7. Exercises
         Exercises I
         Exercises II

Chapter 6. Item Analysis and Validation
    6.1. Item Analysis
    6.2. Validation
    6.3. Reliability
    6.4. Exercises

Chapter 7. Performance-Based Test
    7.1. Introduction
    7.2. Performance-Based Tests
    7.3. Performance Tasks
    7.4. Rubrics and Exemplars
    7.5. Creating Rubrics
    7.6. Tips on Designing Rubrics
    7.7. Administering Performance-Based Tests
    7.8. Exercises

Chapter 8. Grading Systems
    8.1. Norm-Referenced Grading
    8.2. Criterion-Referenced Grading
    8.3. Four Questions in Grading
    8.4. What Should Go Into a Student's Grade
    8.5. Standardized Test Scoring
    8.6. Cumulative and Averaging Systems of Grading
    8.7. K to 12 Grading System: Reproduced from DepEd Order No. 31, s. 2012
    8.8. Alternative Grading System
    8.9. Exercises

References
Appendix A
Chapter 1

SHIFT OF EDUCATIONAL FOCUS FROM CONTENT TO LEARNING OUTCOMES
Reduced to the barest components, the educative process
happens between the teacher and the student. Education comes
from the Latin roots "educare" or "educere," which mean "to
draw out." Ironically, however, for centuries we succeeded in
perpetuating the belief that education is a "pouring in" process
wherein the teacher was the source of knowledge and the student
the passive recipient. We were used to regarding education
basically in terms of designating a set of subjects to take, and
when these are passed we pronounce the students "educated,"
assuming that the instruction and activities we provided will
lead to the desired knowledge, skills and other attributes that
we think the course passers would possess.

The advent of technology caused a change of perspective
in education, nationally and internationally. The teacher ceased
to be the sole source of knowledge. With the knowledge explosion,
students are surrounded by various sources of facts and
information accessible through user-friendly technology. The
teacher has become the facilitator of knowledge who assists in the
organization, interpretation and validation of acquired facts and
information.
1.1. Outcomes-Based Education: Matching Intentions with Accomplishments

Outcomes-Based Education (OBE) has three (3) characteristics:

1. It is student centered; that is, it places the students at the
   center of the process by focusing on Student Learning
   Outcomes (SLO).

2. It is faculty driven; that is, it encourages faculty responsibility
   for teaching, assessing program outcomes and motivating
   participation from the students.

3. It is meaningful; that is, it provides data to guide the teacher
   in making valid and continuing improvement in instruction
   and assessment activities.
To implement outcomes-based education on the subject or course
level, the following procedure is recommended:
1. Identification of the educational objectives of the subject/
   course. Educational objectives are the broad goals that the
   subject/course expects to achieve, defining in general
   terms the knowledge, skills and attitude that the teacher will
   help the students to attain. The objectives are stated from the
   point of view of the teacher, such as: "to develop, to provide,
   to enhance, to inculcate, etc."
2. Listing of student learning outcomes specified for each subject/
   course objective. Since educational objectives are broadly
   stated, they do not provide a detailed guide to be teachable and
   measurable. Learning outcomes are stated as concrete active
   verbs such as: to demonstrate, to explain, to differentiate, to
   illustrate, etc. A good source of learning outcome statements
   is the taxonomy of educational objectives by Benjamin Bloom.
   Bloom's taxonomy of educational objectives is grouped into
   three (3) domains:
   • Cognitive, also called knowledge, refers to mental skills
     such as remembering, understanding, applying, analyzing,
     evaluating, synthesizing/creating.

   • Psychomotor, also referred to as skills, includes manual or
     physical skills, which proceed from mental activities and
     range from the simplest to the complex such as observing,
     imitating, practising, adapting and innovating.

   • Affective, also known as attitude, refers to growth in
     feelings or emotions from the simplest behavior to the
     most complex such as receiving, responding, valuing,
     organizing, and internalizing.
3. Drafting the outcomes assessment procedure. This procedure
   will enable the teacher to determine the degree to which
   the students are attaining the desired learning outcomes. It
   identifies for every outcome the data that will be gathered,
   which will guide the selection of the assessment tools to be
   used and at what point assessment will be done.
1.2. The Outcomes of Education

Outcomes-based education focuses classroom instruction on the
skills and competencies that students must demonstrate when they
exit. There are two (2) types of outcomes: immediate and deferred
outcomes.

Immediate outcomes are the competencies/skills acquired upon
completion of a subject, a grade level, a segment of the program, or
of the program itself.
Examples:
• Ability to communicate in writing and speaking
• Mathematical problem-solving skill
• Skill in identifying objects by using the different senses
• Ability to produce artistic or literary works
• Ability to do research and write the results
• Ability to present an investigative science project
• Skill in story-telling
• Promotion to a higher grade level
• Graduation from a program
• Passing a required licensure examination
• Initial job placement
Deferred outcomes refer to the ability to apply cognitive,
psychomotor and affective skills/competencies in various situations
after completion of a subject, grade level or degree program.

Examples:
• Success in professional practice or occupation
• Promotion in a job
• Awards and recognition
• Success in career planning, health and wellness
1.3. Sample Educational Objectives and Learning Outcomes in Araling Panlipunan (K to 12)

Educational Objectives | Learning Outcomes

1. Pagbibigay sa mga mag-aaral ng kaalaman at pang-unawa
   tungkol sa tao, kapaligiran at lipunan (Cognitive objective)
     1.1. Nailalarawan ang sariling buhay simula sa pagsilang
          hanggang sa kasalukuyang edad.
     1.2. Nasasabi at naipapaliwanag ang mga alituntunin sa
          silid-aralan at sa paaralan.
     1.3. Naiisa-isa ang mga tungkulin ng isang mabuting
          mamamayan sa pangangalaga ng kapaligiran.

2. Paglinang ng kakayahan na magsagawa ng proyektong
   pangtahanan at pampamayanan (Psychomotor objective)
     2.1. Nakakasulat ng sanaysay na naglalarawan ng mga taong
          bumubuo ng sariling pamilya.
     2.2. Nakapagsasagawa ng panayam ng ilang mahahalagang
          pinuno ng sariling barangay at naisusulat ang mga
          nakalap na kaalaman.

3. Pagganyak sa mga mag-aaral upang maipamalas ang malalim
   na pagpapahalaga sa kapaligiran (Affective objective)
     3.1. Nakasusulat ng tula, awit o maikling kuwento tungkol sa
          kahalagahan ng kapaligiran.
     3.2. Nakagagawa ng "video presentation" tungkol sa wastong
          pag-aalaga ng kapaligiran.
1.4. Exercises

A. The following statements are incorrect. On the blank before
each number, write the letter of the section which makes
the statement wrong, and on the blank after each number,
re-write the wrong section to make the statement correct.

___ 1. (a) Because of knowledge explanation (b) brought about by
       the use (c) of computers in education, (d) the teacher ceased
       to be the sole source of knowledge.

___ 2. (a) At present, (b) the teacher is the giver of knowledge
       (c) by assisting (d) in the organization of facts and
       information.

___ 3. (a) The change of focus in instruction (b) from outcomes
       (c) to content (d) is known as Outcomes-Based Education
       (OBE).

___ 4. (a) A good source (b) of subject matter statements (c) is
       Benjamin Bloom's (d) Taxonomy of Educational Objectives.

___ 5. (a) Education comes (b) from the Latin roots (c) "educare"
       or "educere" (d) which mean "to pour in."

___ 6. (a) In the past, (b) the focus (c) of instruction (d) was
       learning outcomes.

___ 7. (a) "Pagbibigay sa mag-aaral ng kaalaman at pang-unawa
       (b) tungkol sa tao, kapaligiran at lipunan" (c) is an example
       (d) of learning outcomes.

___ 8. (a) Ability to communicate (b) in writing and speaking
       (c) is an example (d) of deferred outcome.

___ 9. (a) The content and the outcome (b) are the two (c) main
       elements (d) of the educative process.

___ 10. (a) "Nailalarawan ang sariling buhay (b) simula sa pagsilang
        hanggang sa kasalukuyang edad" (c) is an example (d) of
        educational objective.
B. The following are educational objectives for the subject
Elementary Science (K to 12). For every educational objective,
formulate two learning outcomes:
Educational Objectives | Learning Outcomes

1. To provide instruction that will enable the pupils to understand
   their immediate physical environment by using their senses,
   questioning, sharing ideas and identifying simple cause-and-effect
   relationships. (Cognitive objective)
     1.1. The pupils can ____________________
     1.2. The pupils can ____________________

2. To equip the pupils with the skill to conduct guided investigation
   by following a series of steps that includes making and testing
   predictions, collecting and recording data, discovering patterns
   and suggesting possible explanations. (Psychomotor objective)
     2.1. The pupils can ____________________
     2.2. The pupils can ____________________

3. To encourage among the pupils a deep understanding and
   appreciation of the differences of the plant and animal groups
   found in the locality. (Affective objective)
     3.1. The pupils can ____________________
     3.2. The pupils can ____________________
C. Differentiate each of the following pairs by explaining the
meaning of each and giving examples for further clarification.

1. Educational Objective and Learning Outcome

3. Content and Learning Outcome

4. Student-Centered Instruction and Content-Centered Instruction

5. "to develop communication skills" and "can communicate
   orally and in writing"
Chapter 2

MEASUREMENT, ASSESSMENT AND EVALUATION IN OUTCOMES-BASED EDUCATION

With the change of focus in instruction came the need
to redefine and clarify the terms used to determine the progress
of students towards attainment of the desired learning outcomes.
These are measurement, evaluation and assessment.
2.1. Measurement

Measurement is the process of determining or describing
the attributes or characteristics of physical objects generally in
terms of quantity. When we measure, we use some standard
instrument to find out how long, heavy, hot, voluminous, cold,
fast or straight some things are. Such instruments may be a
ruler, scale, thermometer or pressure gauge.

To measure is to apply a standard measuring device to an
object, group of objects, events or situations according to a
procedure determined by one who is skilled in the use of such
device.
Sometimes, we can measure physical quantities by
combining directly measurable quantities to form derived
quantities. For example, to find the area of a rectangular piece of
paper, we simply multiply the lengths of its sides.

In the field of education, however, the quantities and qualities
of interest are abstract, unseen and cannot be touched, and so
the measurement process becomes difficult; hence, the need to
specify the learning outcomes to be measured. For instance,
knowledge of subject matter can be measured through a testing
procedure. The same concept can also be measured by asking a
group of experts to rate a student's knowledge.

Measurements can therefore be objective (as in testing)
or subjective (as in expert ratings). In the example cited, testing
produces objective measurements while expert ratings provide
subjective measurements.
Objective measurements are generally more stable in
the sense that repeated measurements of the same quantity or
quality of interest will produce more or less the same outcome.
For this reason many people prefer objective measurements over
subjective measurements whenever they are available. However,
there are certain facets of the quantity or quality of interest that
cannot be successfully captured by objective procedures but
which can be done by subjective methods, e.g. the aesthetic
appeal of a product or project of a student, a student's
performance in a drama, etc. It follows that it may be best to use
both methods of assessment whenever the constraints of time and
resources permit.
Whether one uses an objective or subjective assessment
procedure, the underlying principle in educational measurement
is summarized by the following formula:

    Measurement of Quantity or Quality of Interest
        = true value + random error

Each measurement of the quantity of interest has two
components: a true value of the quantity and a random error
component. The objective in educational measurement is to
estimate or approximate, as closely as possible, the true value of
the quantity of interest, e.g. true knowledge of the subject matter.
This is a tall order and one which will occupy most of our time
in this particular course.
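The "true value plus random error" formula can be sketched as a small simulation. All the numbers here are hypothetical illustration values, not from the text: a student's true score is fixed at 75.0, each observed score adds Gaussian random error, and the average of many repeated measurements approximates the true value.

```python
import random

# A minimal sketch of: measurement = true value + random error.
# TRUE_SCORE and ERROR_SD are hypothetical, chosen for illustration.
random.seed(42)  # fixed seed so the sketch is reproducible

TRUE_SCORE = 75.0  # the student's "true" knowledge of the subject matter
ERROR_SD = 5.0     # spread of the random error component

def observe() -> float:
    """One measurement: the true value distorted by random error."""
    return TRUE_SCORE + random.gauss(0, ERROR_SD)

# Repeated measurements scatter around the true value, so their
# average estimates it -- the goal of educational measurement.
scores = [observe() for _ in range(1000)]
estimate = sum(scores) / len(scores)
print(round(estimate, 1))  # close to 75.0
```

Any single score can be several points off, which is why the estimate improves as measurements accumulate.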
Objective measurements are measurements that do not
depend on the person taking the measurement; that is, the same
measurement values should be obtained when using an objective
assessment procedure. In contrast, subjective measurements often
differ from one assessor to the next even if the same quantity or
quality is being measured.
2. Measuring Indicators, Variables and Factors

An educational variable (denoted by an English letter,
like X) is a measurable characteristic of a student. Variables may
be directly measurable, as in X = age or X = height of a student.
However, many times a variable cannot be directly measured, as
when we want to measure the "class participation" of a student. For
those variables where direct measurements are not feasible, we
introduce the concept of indicators.
An indicator, I, denotes the presence or absence of a
measured characteristic. Thus:

    I = 1, if the characteristic is present
    I = 0, if the characteristic is absent

For the variable X = class participation, we can let I1, I2,
..., In denote the participation of a student in n class recitations
and let X = sum of the I's divided by n recitations. Thus, if there
were n = 10 recitations and the student participated in 5 of these
10, then X = 5/10 or 50%.
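The class-participation computation above is just an average of 0/1 indicators. A minimal sketch; the particular pattern of which 5 of the 10 recitations the student joined is hypothetical:

```python
def participation_rate(indicators):
    """X = (sum of the I's) / n, where each I is 1 if the student
    participated in that recitation and 0 if not."""
    return sum(indicators) / len(indicators)

# The chapter's example: n = 10 recitations, 5 participations.
# (Which specific recitations carry the 1s is an illustrative guess.)
recitations = [1, 0, 1, 0, 1, 1, 0, 0, 1, 0]
X = participation_rate(recitations)
print(X)  # 0.5, i.e. 50% class participation
```

The same helper works for any 0/1 indicator series, e.g. attendance or homework submission.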
Indicators are the building blocks of educational
measurement upon which all other forms of measurement are
built. A group of indicators constitutes a variable. A group of
variables forms a construct or a factor. The variables which form
a factor correlate highly with each other but have low correlations
with variables in another group.
Example: The following variables were measured in a
battery of tests:

    X1 = computational skills
    X2 = reading skills
    X3 = vocabulary
    X4 = logic and reasoning
    X5 = sequences and series
    X6 = manual dexterity

These variables can be grouped as follows:

    Group 1: (X1, X4, X5) = mathematical ability factor
    Group 2: (X2, X3) = language ability factor
    Group 3: (X6) = psychomotor ability factor

The first group is called a "mathematical ability" factor,
the second group is called a "language ability" factor, while
the third group (with only one variable) is called a "psychomotor
ability" factor.
In educational measurement, we shall be concerned
with indicators, variables and factors of interest in the field of
education.
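The "correlate highly within a factor, weakly across factors" idea can be checked numerically with a Pearson correlation. In this sketch both the five students' scores and the `pearson` helper are illustrative assumptions, not from the text:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equally long score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical test scores of five students:
x1 = [80, 70, 90, 60, 85]  # X1 = computational skills
x4 = [78, 68, 92, 61, 84]  # X4 = logic and reasoning
x6 = [70, 75, 72, 71, 68]  # X6 = manual dexterity

r14 = pearson(x1, x4)  # high: X1 and X4 load on the mathematical ability factor
r16 = pearson(x1, x6)  # low: X6 belongs to a different (psychomotor) factor
print(round(r14, 2), round(r16, 2))
```

With these made-up scores, r14 comes out near 1 while r16 stays small in magnitude, mirroring how the grouping into factors is justified.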
2.2. Assessment

Assessment is the process of gathering evidence of students'
performance over a period of time to determine learning and
mastery of skills. Such evidences of learning can take the
forms of dialogue records, journals, written work, portfolios, tests
and other learning tasks. Assessment requires review of journal
entries, written work, presentations, research papers, essays, stories
written, test results, etc. Assessment results show the more
permanent learning and a clearer picture of the student's ability.
Assessment of skill attainment is relatively easier than
assessment of understanding and other mental abilities. Skills can
be practised and are readily demonstrable: either the skill exists
at a certain level or it doesn't. Assessment of understanding is
much more complex. We can assess a person's knowledge in
a number of ways, but we need to infer from certain indicators
of understanding through written descriptions. Assessment of
learning outcomes will be treated in a separate chapter.
2.3. Evaluation

Evaluation originates from the root word "value," and so
when we evaluate, we expect our process to give information
regarding the worth, appropriateness, goodness, validity or
legality of something for which a reliable measurement has been
made.
Objects of evaluation include instructional programs, school
projects, teachers, students, and educational goals. Examples
include evaluating the "education for all" project of a school
district, the comparative effectiveness of two remedial reading
programs, the correlation between achievement test results and
diagnostic test results, and the attributes of an effective teacher.
Evaluations are often divided into two broad categories:
formative and summative.

Formative evaluation is conducted while the program or
activity is in progress. The results of formative evaluation give
information to the proponents, learners and teachers on how well
the objectives of the program are being attained. Its main
objective is to determine deficiencies so that the appropriate
interventions can be done. Formative evaluation may also be
used in analyzing learning materials, student learning and
achievements, and teacher effectiveness.

The instruments used to gather data for summative
evaluation are questionnaires, survey forms, interview/observation
guides and tests. Summative evaluation is designed to determine
the effectiveness of a program or activity based on its avowed
purposes. Scriven gave as techniques for summative evaluation:
pretest-posttest with one group; pretest-posttest with experimental
and control groups; and one-group descriptive analysis. The subject
of evaluation is wider than assessment, which focuses specifically
on student learning outcomes.
2.4. Exercises

A. Each of the following statements refers either to (a)
measurement, (b) assessment, or (c) evaluation. On the
blank before each number, write the letter corresponding to
your answer.

___ 1. Over-all goal is to provide information regarding the extent
       of attainment of student learning outcomes.
___ 2. Can help educators determine the success factors of
       academic programs and projects.
___ 3. Uses such instruments as ruler, scale or thermometer.
___ 4. Used to determine the distance of a location.
___ 5. Process designed to aid educators make judgment and
       indicate solutions to academic situations.
___ 6. Can determine skill attainment easier than attainment of
       understanding.
___ 7. Process of gathering evidence of student competencies/skills
       over a period of time.
___ 8. Results show the more permanent learning and clearer
       picture of student's ability.
___ 9. Objects of study may be instructional programs, school
       projects, teachers, students or test results.
___ 10. Usually expressed in quantities.
B. List down three (3) activities or processes involved in each of the
following:

1. Measurement
   (a) ____________________
   (b) ____________________
   (c) ____________________

2. Assessment
   (a) ____________________
   (b) ____________________
   (c) ____________________

3. Evaluation
   (a) ____________________
   (b) ____________________
   (c) ____________________
C. Differentiate each of the following pairs; examples may be cited
to further clarify the meaning.

1. Assessment and Evaluation

3. Mental skill and Manual skill
Chapter 3

THE OUTCOMES OF STUDENT LEARNING

3.1. Program Objectives and Student Learning Outcomes
The shift of focus in education from content to student
learning outcomes has changed teachers' instructional perspective.
In the past, teachers were often heard expressing their concern
about finishing their subject matter before the end of the term.
Maybe because of the number of their students or failure to clarify
the desired learning outcomes, teachers' concern for outcomes was
secondary to the completion of the planned content for the subject.

The focus on outcomes poses questions such as: What
competencies, knowledge or other characteristics should the
graduates or passers possess? While educational objectives are
stated from the point of view of the teacher, student learning
outcomes are stated from the point of view of the students, usually
beginning with "The students can . . ." and completing the
statement, whenever possible, with concrete active verbs.
3.2. The Three Types of Learning

Believing that there was more than one (1) type of learning,
Benjamin Bloom and a committee of colleagues identified three
domains of educational activities: cognitive, referring to mental
skills; affective, referring to growth in feelings or emotions; and
psychomotor, referring to manual or physical skills. These terms
were regarded as too technical by practicing teachers, and so the
domains were translated to simpler terms commonly used by
teachers: knowledge, skills and attitudes (KSA).

These domains are organized into categories or levels and
arranged from the simplest to the most complex behavior. To
ensure that the learning outcomes are measurable, demonstrable
and verifiable, the outcomes should be stated as concrete and
active verbs.

Bloom's taxonomy was later revised. The two most prominent
of these revisions are (a) changing the names in the six subdivisions
of the cognitive domain from noun to verb and (b) slightly
re-arranging their order.
3.3. DOMAIN I: Cognitive (Knowledge)

Categories/Levels | Outcomes Verbs | Learning Outcomes Statements

1.1 Remembering: recall of learned information
    Verbs: define, describe, identify, label, match, list, name,
    outline, recall, recognize, reproduce, select, state
    Example outcomes: Recite the multiplication tables; match the
    word with the parts of the picture of a sewing machine

1.2 Understanding: comprehending the meaning, translation and
    interpretation of instructions; stating a problem in one's own words
    Verbs: distinguish, estimate, explain, give example, interpret,
    paraphrase, summarize
    Example outcomes: Explain in one's own words the stages in the
    life cycle of a butterfly; distinguish the different geometric figures

1.3 Applying: using what was learned in the classroom in similar
    new situations
    Verbs: apply, change, compute, construct, demonstrate, discover,
    modify, prepare, produce, show, solve
    Example outcomes: Use a mathematical formula to solve an algebra
    problem; prepare daily menus for one week for a family of six

1.4 Analyzing: separating materials or concepts into component
    parts to understand the whole
    Verbs: analyze, compare, contrast, diagram, differentiate,
    distinguish, illustrate, outline, select
    Example outcomes: Observe a classroom and list down the things
    to be improved; differentiate the parts of a tree

1.5 Evaluating: judging the value of an idea, object or material
    Verbs: compare, conclude, criticize, critique, defend, evaluate,
    relate, support, justify
    Example outcomes: Defend a research proposal; select the most
    effective solution; critique a class demonstration

1.6 Creating: building a structure or pattern; putting parts together
    Verbs: categorize, combine, compile, compose, devise, design,
    plan, organize, revise, rearrange, generate, modify
    Example outcomes: Compile personal records and documents into
    a portfolio; write a syllabus for a school subject
[Figure: The cognitive domain levels arranged from simplest to most complex: Remembering, Understanding, Applying, Analyzing, Evaluating, Creating]
3.4. DOMAIN II: Psychomotor (Skills)

Development of these skills requires constant practice in
accuracy and speed. Simpson contributed 7 categories, Dave 5
categories and Harrow 6 categories. They have been reorganized
and simplified into 4 categories or levels.

Categories/Levels | Outcomes Verbs | Learning Outcomes Statements

2.1 Observing: active mental attention to a physical activity
    Verbs: watch, detect, distinguish, differentiate, describe,
    relate, select
    Example outcomes: Detect non-verbal communication cues; watch
    a more experienced person; observe and read directions

2.2 Imitating: attempting to copy a physical behavior
    Verbs: begin, explain, move, display, proceed, react, state,
    volunteer
    Example outcomes: Show understanding and do the sequence of
    steps with assistance; recognize one's limitations

2.3 Practising: performing a specific activity repeatedly
    Verbs: bend, calibrate, construct, differentiate, dismantle,
    fasten, fix, grasp, grind, handle, measure, mix, operate,
    manipulate, mend
    Example outcomes: Operate quickly and accurately; display
    competence while performing; performance is moving towards
    becoming automatic and smooth

2.4 Adapting: fine-tuning the skill and making minor adjustments
    to attain perfection
    Verbs: organize, relax, shorten, sketch, write, re-arrange,
    design
    Example outcomes: Perform automatically; construct a new
    scheme/sequence; apply the skill in a new situation; create a
    new routine; develop a new program
[Figure: The psychomotor domain levels arranged from simplest to most complex: Observing, Imitating, Practising, Adapting]
3.5. DOMAIN III: Affective (Attitude)

The affective domain refers to the way in which we deal
with situations emotionally. The taxonomy is ordered into five (5)
levels as the person progresses towards internalization, in which
the attitude or feeling consistently guides or controls a person's
behavior.

Categories/Levels | Outcomes Verbs | Learning Outcomes Statements

3.1 Receiving: being aware of or sensitive to something and being
    willing to listen or pay attention
    Verbs: select, point to, sit, choose, describe, follow, hold,
    identify, name, reply
    Example outcomes: Listen to others with respect; try to remember
    profiles and facts

3.2 Responding: showing commitment to respond in some measure
    to the idea or phenomenon
    Verbs: answer, assist, aid, comply, conform, discuss, greet,
    help, perform, practice, read, recite, report, tell, write
    Example outcomes: Participate in discussions; give a presentation;
    know the rules and practice them; question concepts in order to
    understand them well

3.3 Valuing: showing willingness to be perceived as valuing or
    favoring certain ideas
    Verbs: complete, demonstrate, differentiate, explain, follow,
    invite, join, justify, propose, report, share, study, perform
    Example outcomes: Demonstrate belief in the concept or process;
    show ability to resolve problems/conflicts; propose a plan for
    improvement; inform management/supervisor on matters that need
    attention

3.4 Organizing: arranging values into priorities, creating a unique
    value system by comparing, relating and synthesizing values
    Verbs: arrange, combine, complete, adhere, alter, defend,
    explain, formulate, integrate, organize, relate, synthesize
    Example outcomes: Accept responsibility; recognize the need for
    balance between freedom and responsible behavior; explain how to
    plan to solve problems; prioritize time effectively for family,
    work and personal life

3.5 Internalizing: practicing a value system that controls one's
    behavior; exhibiting behavior that is consistent, pervasive,
    predictable and characteristic of the person
    Verbs: act, display, influence, listen, discriminate, modify,
    perform, revise, solve, verify
    Example outcomes: Show self-reliance when working in a group;
    display objectivity in problem-solving; revise judgment in light
    of new evidences; value people for what they are and not for how
    they look

[Figure: The affective domain levels arranged from simplest to most complex: Receiving, Responding, Valuing, Organizing, Internalizing]
3.6. Exercises

A. The following are examples of learning outcomes. On the second
column, write the domain in which each outcome is classified,
and on the third column, the level/category to which the learning
outcome belongs.

Learning Outcome | Domain | Level/Category

1. Formulate a procedure to follow in preparing for a class
   demonstration
2. Formulate a new program
3. Perform repeatedly with speed and accuracy
4. Listen to others with respect
5. Select the most effective among a number of solutions
6. Watch a more experienced performer
7. Know the rules and practice them
8. Show ability to resolve problems/conflicts
9. Apply learning principles in studying pupil behavior
10. Recite prices of commodities from memory
B. Using the indicated topic or subject matter, write learning outcomes for each of the 3 domains, arranged from the simplest to the most complex level or category.

1. Cognitive: Topic - Investigative Project in Biological Science
   1.1 Remembering
   1.2 Understanding
   1.3 Applying
   1.4 Analyzing
   1.5 Evaluating
   1.6 Creating

2. Psychomotor: Topic - Table Setting
   2.1 Observing
   2.2 Imitating
   2.3 Practicing
   2.4 Adapting

3. Affective: Topic - Developing and Nurturing Honesty
   3.1 Receiving
   3.2 Responding
   3.3 Valuing
   3.4 Organizing
   3.5 Internalizing
Chapter 4

ASSESSING STUDENT LEARNING OUTCOMES
4.1. Principles in Assessing Learning Outcomes

1. There should be a clear statement on the kinds of learning that the institution values most for its students.

2. Assessment should be aligned with objectives; this alignment ensures clear, shared and implementable objectives.

3. A sound approach is to design assessment activities which are observable and less abstract, such as "to determine the student's ability to write a paragraph," which is more observable than "to determine the student's verbal ability."

Assessment of these objectives is supported by student activities.
4.2. Samples of Supporting Student Activities

1. Students gather and select information from secondary sources as basis of a research topic.
   1.1 practice differentiating source material and one's own opinion
   1.2 reading articles and formulating an original paragraph from quotes, paraphrases and summaries
   1.3 writing of essays to develop the topic
   1.4 integrating bibliographic entries in appropriate format

2. Students apply principles of logical thinking and persuasive argument in writing.
   2.1 forming opinions about the topic
   2.2 researching and writing about a variety of perspectives
   2.3 adapting style to the identified audience
   2.4 employing clear argument in writing

3. Students write multiple-page essays complying with standard format and style.
   3.1 evaluating texts
   3.2 writing about a variety of perspectives on a single topic
   3.3 adapting tone and style to address one's audience
   3.4 reviewing grammar and essay format in readings
   3.5 holding group discussions about various topics
Assessment should be cumulative, because improvement is best achieved through a linked series of activities done over time in an instructional cycle.
[Figure: The outcomes assessment cycle. Institutional mission and program goals lead to subject objectives, which define the desired student learning outcomes. Diagnostic assessment guides the decision on lesson focus; formative assessment accompanies the supporting student activities; summative assessment of the outcomes leads either to mastery learning or to review/reteach.]
4.3. The Outcomes Assessment Phases in the Instructional Cycle

Objective tests (e.g. multiple choice, true/false, matching, simple recall). The advantage in using this type is that scoring is quick and objective, although constructing a good objective test can be difficult.

Essay tests allow for student individuality and expression, although they may not cover an entire range of knowledge.

Written work (e.g. reports, papers, research projects, reviews, etc.). This type allows learning in the process as well as in the product. The disadvantage is that plagiarism may occur and written work is difficult to quantify.

Portfolios. A portfolio may either be a longitudinal portfolio, which contains reports, documents and professional activities compiled over a period, or a best-case/thematic portfolio, which is specific to a certain topic or theme.
Performance assessments are either replicas or simulations of the kinds of situations encountered by adult citizens, consumers or professionals. A scoring rubric allows students to know beforehand how their performance will be judged. In performance assessment:

• emphasis is on the performance itself;
• performance is rated in a range;
• rubrics include specific performance characteristics arranged in levels or degrees in which a standard has been met.

Rubrics are of two major types: holistic and dimensional/analytic. A holistic rubric uses criterion-based standards by providing descriptions of the different levels of performance, like: Most Acceptable, Very Acceptable, Acceptable, Barely Acceptable and Unacceptable.
EXAMPLE OF HOLISTIC RUBRIC THAT MAKES USE OF CRITERION-BASED STANDARDS

ASSESSING A RESEARCH REPORT

Criteria (each rated on the assessment scale):
1. Degree to which the report reflects the objectives of the research
2. Level of creativity
3. Clarity
4. Visual appeal
5. Level of effort
SUB-TOTALS:
TOTAL:

SCORING PROTOCOL:
Most Acceptable: 20 and above
Very Acceptable: 15-19
Acceptable: 10-14
Barely Acceptable: 5-9
Unacceptable: Below 5
A dimensional/analytical rubric yields sub-scores for each dimension, as well as a cumulative score which is the sum, either weighted or unweighted, of the sub-scores. A dimensional rubric utilizes multiple indicators of quality for academic tasks that involve more than one level of skill or ability.
EXAMPLE OF DIMENSIONAL/ANALYTICAL RUBRIC

Assessment of Report on the Analysis of Public Opinions on the Divorce Bill

Criteria (qualitative assessment scale):
A. Clarity in defining the issue/topic - Levels 0 to 3
B. Level of scholarly research done - Levels 0 to 3
C. Aesthetic appeal of report - Levels 0 to 3
SUM:

Assessment:

A. Clarity in Defining the Issue/Topic
3 - The issue/topic was clearly explained in the introductory paragraph.
2 - The issue/topic was mentioned in the introductory paragraph but was not clearly explained.
1 - The issue/topic was merely mentioned, without explanation.
0 - The issue/topic was not mentioned at all.

B. Level of Scholarly Research
3 - The report cited different sources of opinion, properly analyzed.
2 - The report cited different sources of opinion but not analyzed.
1 - The report cited only one or two sources of opinion, without analysis.
0 - The report did not indicate sources and there was no analysis.

C. Aesthetic Appeal
3 - The report is well written, without errors in grammar and syntax.
2 - The report is well written, with lapses in grammar and syntax.
1 - The report is written in incomplete and incoherent sentences.
0 - The report is only in outline form.
SCORING PROTOCOL:
Most Acceptable: 7 and above
Partially Acceptable: 4-6
Unacceptable: Below 4
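The cumulative score and scoring protocol of a rubric like the one above amount to simple arithmetic. A minimal sketch in Python (the function names are ours; the optional weights illustrate the "weighted or unweighted" sum mentioned earlier):

```python
# Sketch: cumulative score and rating for the dimensional rubric above.
# Cut-off scores follow the example's scoring protocol.

def rubric_total(scores, weights=None):
    """Cumulative score: the sum of dimension sub-scores, weighted or not."""
    if weights is None:
        weights = {dim: 1 for dim in scores}
    return sum(scores[dim] * weights[dim] for dim in scores)

def rating(total):
    """Scoring protocol of the example rubric."""
    if total >= 7:
        return "Most Acceptable"
    if total >= 4:
        return "Partially Acceptable"
    return "Unacceptable"

# A report scored 3 on clarity (A), 2 on research (B), 2 on appeal (C):
scores = {"A": 3, "B": 2, "C": 2}
print(rubric_total(scores), rating(rubric_total(scores)))  # 7 Most Acceptable
```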
6.6. Competencies/Skills Assessment from Beginner to Proficiency Level

Skills acquisition undergoes phases from beginner to proficiency level. This may be illustrated in assessing cognitive and psychomotor skills, as demonstrated in the combination of an adaptation of the "Motor Skills Acquisition" model by Patricia Benner applied to the "Assessment of Critical Thinking and of Technological Skills" by Hernon and Dugan.
EXAMPLES OF COMPETENCY/SKILLS ASSESSMENT
(Adapted from Peter Hernon and Robert Dugan, and Patricia Benner)

COMPETENCY: CRITICAL THINKING

Student's name: ______________  Date: ______________
Subject Title: ______________  Course: ______________

Students must be able to think critically by performing specific cognitive tasks. Each task is rated: Not Applicable, Not Evident, Beginner, Capable or Competent.

1. REMEMBERING
   a) Recalls content and details
      Beginner: recalls some content and details, but not always accurately
      Capable: recalls most content and details accurately
      Competent: recalls all significant content and details accurately
   b) Identifies classifications, principles, methodologies and theories
      Beginner: identifies some classifications, principles, methodologies and theories
      Capable: identifies most classifications, principles, methodologies and theories
      Competent: identifies significant classifications, principles, methodologies and theories
   c) Restates main ideas, concepts and principles
      Beginner: restates main ideas, concepts and principles with difficulty
      Capable: restates main ideas, concepts and principles with minimal assistance
      Competent: restates main ideas, concepts and principles clearly and accurately

2. UNDERSTANDING
   a) Explains ideas, concepts and principles
      Beginner: explains ideas, concepts and principles with limited accuracy and irrelevant examples
      Capable: explains ideas, concepts and principles with some accuracy and relevant examples
      Competent: explains ideas, concepts and principles with accuracy and relevant examples
   b) Contextualizes ideas, concepts and principles
      Beginner: contextualizes ideas, concepts and principles with difficulty
      Capable: contextualizes ideas, concepts and principles with minimal assistance
      Competent: contextualizes ideas, concepts and principles with ease
Objective                                              | Item Numbers        | No. | %
1. Knowledge (identify subject and verb)               | 1, 3, 5, 7, 9       | 5   | 16.67%
2. Comprehension (forming appropriate verb forms)      | 2, 4, 6, 8, 10      | 5   | 16.67%
3. Application (determining subject and predicate)     | 11, 13, 15, 17, 19  | 5   | 16.67%
4. Analysis (formulating rules on agreement)           | 12, 14, 16, 18, 20  | 5   | 16.67%
5. Synthesis/Evaluation (writing of sentences
   observing rules on subject-verb agreement)          | Part II (essay)     | 10  | 33.33%
Total                                                  |                     | 30  | 100%
In the table of specifications we see that there are five items that deal with knowledge, and these are items 1, 3, 5, 7, 9. Similarly, from the same table we see that five items represent analysis, namely: 12, 14, 16, 18, 20. The first four levels of Bloom's taxonomy are equally represented in the test, while synthesis/evaluation (tested through the essay) is weighted equivalent to ten (10) points, or double the weight given to any of the first four levels.

With a table of specifications, the test that will be constructed by the teacher will be more or less comprehensive. Without the table of specifications, the tendency for the test maker is to focus too much on facts and concepts at the knowledge level.
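The percentage column of a table of specifications is simply each objective's point allocation divided by the total number of points. A sketch of the computation, using the point allocations from the worked example (the variable names are ours):

```python
# Sketch: the % column of a table of specifications.
# Four objective levels carry 5 one-point items each; the essay on
# synthesis/evaluation carries 10 points, as in the example above.

allocations = {
    "Knowledge": 5,
    "Comprehension": 5,
    "Application": 5,
    "Analysis": 5,
    "Synthesis/Evaluation (essay)": 10,
}

total_points = sum(allocations.values())   # 30
for objective, points in allocations.items():
    print(f"{objective}: {100 * points / total_points:.2f}%")
# Each 5-point level contributes 16.67%; the 10-point essay
# contributes 33.33%, double the weight of any single level.
```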
Constructing the test items. The actual construction of the test items follows the TOS. As a general rule, it is advised that the actual number of items to be constructed in the draft should be double the desired number of items. For instance, if there are five (5) knowledge level items to be included in the final test form, then at least ten (10) knowledge level items should be included in the draft. The subsequent test try-out and item analysis will most likely eliminate several items in the draft (either because they are too difficult, too easy or non-discriminatory). Hence, it will be necessary to construct more items than will actually be included in the final test form.
5.3. True-False Tests

A true-false test presents statements which the student marks as true or false. A student who knows nothing of the content of the examination would have a 50% chance of getting the correct answer by sheer guesswork. Although correction-for-guessing formulas exist, it is best that the teacher offsets the effect of guessing by requiring students to explain their answers and by disregarding a correct answer if the explanation is incorrect. Here are some rules of thumb in constructing true-false items.
Rule 1. Do not give a hint in the body of the question.

Example: The Philippines gained its independence in 1898 and therefore celebrated its centennial year in 2000.

Obviously, the answer is FALSE because 100 years from 1898 is not 2000 but 1998.
Rule 2. Avoid using the words "always", "never" and other such adverbs which tend to be always true or always false.

Example: Christmas always falls on a Sunday because it is a Sabbath day.

Statements that use the word "always" are almost always false. A test-wise student can easily guess his way through tests like these and get high scores even if he does not know anything about the subject matter.
Rule 3. Avoid long sentences, which tend to be true. Keep sentences short.

Example: Tests need to be valid, reliable and useful, although it would require a great amount of time and effort to ensure that tests possess these test characteristics.

Notice that the statement is true. However, we are also not sure which part of the sentence the student deems true. It is just fortunate that, in this case, all parts of the sentence are true and, hence, the entire sentence is true. The following example illustrates what can go wrong in long sentences:

Example: Tests need to be valid, reliable and useful since it takes very little amount of time, money and effort to construct tests with these characteristics.

The first part of the sentence is true but the second part is debatable and may, in fact, be false. Thus, a "true" response is correct, and also a "false" response is correct.
Rule 4. Avoid tricky items in which the answer hinges on a trivial detail such as a misspelled word. A test-wise student who does not know the subject matter may detect this strategy and thus get the answer correctly.

Example: True or False. The Principle of our school is Mr. Albert P. Panadero.

The Principal's name may actually be correct, but since the word "Principal" is misspelled and the entire sentence takes a different meaning, the answer would be false! This is an example of a tricky but utterly useless item.
Rule 5. Avoid quoting statements verbatim from textbooks. This practice sends the wrong signal to the students that it is necessary to memorize the textbook word for word and, thus, acquisition of higher level thinking skills is not given due importance.
Rule 6. Avoid specific determiners. Students quickly learn that strongly worded statements are more likely to be false than true, for example, statements with "never", "no", "all" or "always." Moderately worded statements are more likely to be true than false. Statements with "many", "often", "sometimes", "generally", "frequently" or "some" should likewise be avoided.
Rule 7. With true or false questions, avoid a grossly disproportionate number of either true or false statements.
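The correction-for-guessing formulas mentioned earlier are not spelled out in the text. The standard one (an assumption on our part, not quoted from this book) deducts a fraction of the wrong answers: score = R - W/(n - 1), where R is the number right, W the number wrong and n the number of options per item. For true-false items n = 2, so every wrong answer cancels a right one and a blind guesser expects a corrected score of zero:

```python
# Standard correction-for-guessing formula (assumed, not taken from
# this book): corrected score = R - W / (n - 1), where n is the
# number of options per item.

def corrected_score(right, wrong, n_options):
    return right - wrong / (n_options - 1)

# True-false (n = 2): guessing blindly on 50 items gives, on average,
# 25 right and 25 wrong, for a corrected score of zero.
print(corrected_score(25, 25, 2))   # 0.0

# Four-option multiple choice: 20 right, 12 wrong, 8 omitted.
print(corrected_score(20, 12, 4))   # 16.0
```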
5.4. Multiple Choice Tests

A generalization of the true-false test, the multiple choice type of test offers the student more than two options per item. Each item in a multiple choice test consists of two parts: (a) the stem, and (b) the options. In the set of options, there is a "correct" or "best" option while all the others are considered "distracters". Well-constructed items can test higher order thinking skills even if the options are clearly stated. As in true-false items, there are certain rules of thumb to be followed in constructing multiple choice tests.
Do not use unfamiliar words, terms and phrases. The ability of the item to discriminate, or its level of difficulty, should stem from the subject matter rather than from the wording of the question.

Example: What would be the system reliability of a computer system whose slave and peripherals are connected in parallel circuits and each one has a known time-to-failure probability of 0.05?

A student completely unfamiliar with the terms "slave" and "peripherals" may not be able to answer correctly even if he knew the subject matter of reliability.

Example: Much of the process of photosynthesis takes place in the:
a. bark
b. leaf
c. stem

The qualifier "much" is vague and could have been replaced by a more specific qualifier like "90% of the photosynthetic process" or some similar phrase that would be more precise.
Avoid awkward sentence constructions, which cause comprehension difficulties.

Example:
(Poor) As President of the Republic of the Philippines, Corazon Cojuangco Aquino would stand next to which President of the Philippine Republic subsequent to the 1986 EDSA Revolution?
(Better) Who was the President of the Philippines after Corazon C. Aquino?
Avoid negative and double-negative stems, which complicate the grammatical construction.

Example:
(Poor) Which of the following will not cause inflation in the Philippine economy?
(Better) Which of the following will cause inflation in the Philippine economy?

(Poor) What does the statement "Development patterns acquired during the formative years are NOT unchangeable" imply?
A.
B.
C.
D.

(Better) What does the statement "Development patterns acquired during the formative years are changeable" imply?
A.
B.
C.
D.
Make all distracters equally plausible and attractive.

Example: The short story "May Day's Eve" was written by which Filipino author?
a. Jose Garcia Villa
b. Nick Joaquin
c. Genoveva Edroza-Matute
d. Robert Frost
e. Edgar Allan Poe

If the distracters had all been Filipino authors, the value of the item would be greatly increased. In this particular instance, only the first three carry the burden of the entire item, since the last two can be essentially disregarded by the students.
7) All multiple choice options should be grammatically parallel; the length or explicitness of an option should not suggest the correctness of the answer. The following is an example of a violation of this rule:

Example: If the three angles of two triangles are congruent, then the triangles are:
a. congruent whenever one of the sides of the triangles are congruent
b. similar
c. equiangular and, therefore, must also be congruent
d. equilateral if they are equiangular

The correct choice, "b," may be obvious from its length and explicitness alone. The other choices are long and tend to explain why they must be the correct choices, forcing the students to think that they are, in fact, not the correct answers!
Avoid distracters that are essentially identical in meaning, since a student can eliminate them at once.

Example: What causes ice to transform from solid state to liquid state?
a. Change in temperature
b. Change in pressure
c. Change in the chemical composition
d. Change in heat levels

Options a and d are essentially the same. Thus, a student who spots these identical choices would right away narrow down the field of choices to a, b, and c. The last distracter would play no significant role in increasing the value of the item.
significant role in increasing the value of the item.STS
Chaplet 5 ~Devlpmen of Vased Assessment Tacs: Knowledge and Reasoning
|
Avoid unnecessary material in the stem. The item's value is particularly damaged if the unnecessary material is designed to distract or mislead. Such items test the student's reading comprehension rather than knowledge of the subject matter.

Example: The side opposite the thirty-degree angle in a right triangle is equal to half the length of the hypotenuse. If the sine of a 30-degree angle is 0.5 and the hypotenuse is 5, what is the length of the side opposite the 30-degree angle?
a. 2.5
b. 3.5
c. 5.5
d. 5

The sine of the 30-degree angle is really quite unnecessary, since the first sentence already gives the method for finding the length of the side opposite the thirty-degree angle. This is a case of a teacher who wants to make sure that no student in his class gets the wrong answer!

Note, in the previous example, that knowledge of the sine of the 30-degree angle would have led some students to use the sine formula for the calculation even if a simpler approach would have sufficed.
15) Avoid extreme specificity requirements in responses.

When the choice of the "best" response is intended, "none of the above" is not appropriate, since the implication has already been made that the correct response may be partially inaccurate.

In a multiple option item (allowing only one option choice), if a student only knew that two (2) options were correct, he could then deduce the correctness of "all of the above."
Make the options homogeneous.

(Less homogeneous)
Thailand is located in:
a. Southeast Asia
b. Eastern Europe
c. South America
d. East Africa
e. Central America

(More homogeneous)
Thailand is located next to:
a. Laos and Kampuchea
b. India and China
c. China and Malaya
d. Laos and China
e. India and Malaya
5.5. Matching Type and Supply Type Items

The matching type items may be considered modified multiple choice type items, where the choices progressively reduce as one successfully matches the items on the left with the items on the right.

Example: Match the items in column A with the items in column B.

A                     B
__ 1. Magellan        a. First President of the Republic
__ 2. Mabini          b. National Hero
__ 3. Rizal           c. Discovered the Philippines
__ 4. Lapu-Lapu       d. Brain of the Katipunan
__ 5. Aguinaldo       e. The great painter
                      f. Defended Limasawa Island

Matching type items, unfortunately, often test lower order thinking skills. A variant of the matching type item is the data sufficiency and comparison type of test illustrated below:
Example: Write G if the item on the left is greater than the item on the right; L if the item on the left is less than the item on the right; E if the item on the left equals the item on the right; and D if the relationship cannot be determined.

A                     B
1. Square root of 9   a. 3
2. Square of 25       b. 625
3. 36 inches          c. 3 meters
4. 4 feet             d. 48 inches
5. 1 kilogram         e. 1 pound

The data sufficiency test above can, if properly constructed, test higher order thinking skills. Each item goes beyond simple recall of facts and, in fact, requires the students to make decisions.
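Each comparison in the example reduces to an arithmetic computation or a unit conversion before the G/L/E decision is made. A sketch of the working for four of the items (the conversion factors are standard; the decisions are our own working, since the text gives no answer key):

```python
import math

# Sketch: working out some of the data sufficiency comparisons above.
METERS_PER_INCH = 0.0254
INCHES_PER_FOOT = 12
KG_PER_POUND = 0.45359237

def compare(a, b):
    """G/L/E decision once both sides are expressed in the same unit."""
    if a > b:
        return "G"
    if a < b:
        return "L"
    return "E"

print(compare(math.sqrt(9), 3))           # E: the square root of 9 is 3
print(compare(36 * METERS_PER_INCH, 3))   # L: 36 inches is 0.9144 m
print(compare(4 * INCHES_PER_FOOT, 48))   # E: 4 feet is 48 inches
print(compare(1, KG_PER_POUND))           # G: 1 kg is about 2.2 pounds
```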
Another useful device for testing lower order thinking skills is the supply type of test. Like the multiple choice test, the supply type of test is made up of a stem, but the student supplies the answer, as in: "The study of life and living organisms is __________." Supply type tests depend heavily on the way the stems are constructed. These tests allow for one and only one answer and, hence, often test only lower order thinking skills. It is, however, possible to construct supply type tests that will test higher order thinking, as the following example shows:
Example: Write an appropriate synonym for each of the following. Each blank corresponds to a letter:

Metamorphose: _ _ _ _ _ _
Flourish: _ _ _ _

The appropriate synonym for the first is CHANGE, with six (6) letters, while the appropriate synonym for the second is GROW, with four (4) letters. Notice that these questions require not only mere recall of words but also understanding of these words.
5.6. Essay Tests

In essay tests, students are required to write one or more paragraphs on a specific topic. Essay questions can be used to measure attainment of a variety of objectives. Steklein (1988) has listed 14 types of abilities that can be measured by essay items:

1. Comparisons between two or more things
2. The development and defense of an opinion
3. Questions of cause and effect
4. Explanations of meanings
5. Summarizing of information in a designated area
6. Analysis
7. Knowledge of relationships
8. Illustrations of rules, principles, procedures, and applications
9. Applications of rules, laws, and principles to new situations
10. Criticisms of the adequacy, relevance, or correctness of a concept, idea, or information
11. Formulation of new questions and problems
12. Reorganization of facts
13. Discriminations between objects, concepts, or events
14. Inferential thinking
Example: Write an essay on the topic "Plant Photosynthesis" using the following keywords and phrases: chlorophyll, sunlight, water, carbon dioxide, oxygen, by-product, stomata.

Note that the students are properly guided in terms of the keywords that the teacher is looking for in this essay examination. An essay such as the one given below will get a score of zero (0). Why?
Plant Photosynthesis

Nature has its own way of ensuring the balance between food producers and consumers. Plants are considered producers of food for animals. Plants produce food for animals through a process called photosynthesis. It is a complex process that combines various natural elements on earth into a final product which animals can consume in order to survive. Naturally, we all need to protect plants so that we will continue to have food on our table. We should discourage burning of grasses, cutting of trees and illegal logging. If the leaves of plants are destroyed, they cannot perform photosynthesis and animals will also perish.
Indicate the criteria by which the essay will be graded. This rule allows the students to focus on relevant and substantive materials rather than on peripheral and unnecessary facts and bits of information.

Example: Write an essay on the topic "Plant Photosynthesis" using the keywords indicated. You will be graded according to the following criteria: (a) coherence, (b) accuracy of statements, (c) use of keywords, (d) clarity, and (e) extra points for innovative presentation of ideas.
Grade all of the answers to one question before going on to the next question. This procedure also helps offset the halo effect in grading. When all of the answers on one paper are read together, the grader's impression of the paper as a whole is apt to influence the grades he assigns to the individual answers. Grading question by question, of course, prevents the formation of this overall impression of a student's paper. Each answer is more apt to be judged on its own merits when it is read and compared with other answers to the same question than when it is read and compared with other answers by the same student.
Evaluate answers anonymously. Answers to essay questions should be evaluated in terms of what is written, not in terms of what is known about the writers from other contacts with them. The best way to do this is to evaluate each answer without knowing the identity of the writer. This can be done by having the students write their names on the back of the paper or by using code numbers in place of names.
Whenever possible, obtain two or more independent ratings. Although this may not be a feasible practice for routine classroom testing, it might be done periodically with a fellow teacher (one who is equally competent in the area). Obtaining two or more independent ratings becomes especially vital where the results are to be used for important and irreversible decisions, such as in the selection of students for further training or for special awards. Here the pooled ratings of several competent persons may be needed to attain a level of reliability that is commensurate with the significance of the decision being made.
Some teachers use the cumulative criteria, i.e., adding the weights given to each criterion, as basis for grading, while others use the reverse.
5.7. Exercises

Let's have some mental exercises to test your understanding.

EXERCISE I
A. Give an example to illustrate each of the following rules of thumb in the construction of a true-false test:
1. Avoid giving hints in the body of the question.
2. Avoid using the words "always", "never" and other such adverbs which tend to be always true or always false.
3. Avoid long sentences which tend to be true. Keep sentences short.
4. Avoid a systematic pattern for true and false statements.
5. Avoid ambiguous sentences which can be interpreted as true and at the same time false.
B. Give an example to illustrate each of the following rules of thumb in the construction of multiple choice tests:
1. Phrase the stem to allow for only one correct or best answer.
2. Avoid giving away the answer in the stem.
3. Choose distracters appropriately.
4. Choose distracters so that they are all equally plausible and attractive.
5. Phrase questions so that they will test higher order thinking skills.
6. Do not ask subjective questions or opinions for which there are no right or wrong answers.
EXERCISE II

A. Construct a 10-item matching type test on the topic "Plant Photosynthesis".
B. Construct a 10-item supply type test on the topic "The Discovery of the Philippines".
C. Justify each rule used in constructing an essay type of test.
D. Construct a 10-item data sufficiency test.
E. In a 100-item test, what types of objective tests will you include? Justify your answer.
F. In the sample essay "Plant Photosynthesis" given in this section, why would you give a zero (0) score to the student writing this essay? Justify your answer.
G. Give an example of a supply type of test that will measure higher order thinking skills (beyond mere recall of facts and information).
H. In what sense is a matching type test a variant of a multiple choice type of test? Justify your answer.
I. In what sense is a supply type of test considered a variant of a multiple choice type of test? (Hint: in the supply type, the choices are not explicitly given.) Does this make the supply type of test more difficult than the closed multiple choice type of test? How?

Chapter 6

ITEM ANALYSIS AND VALIDATION
Introduction
The teacher normally prepares a draft of the test. Such a draft is subjected to item analysis and validation in order to ensure that the final version of the test will be useful and functional. First, the teacher tries out the draft test on a group of students of similar characteristics as the intended test takers (try-out phase). From the try-out group, each item will be analyzed in terms of its ability to discriminate between those who know and those who do not know, and also its level of difficulty (item analysis phase). The item analysis will provide information that will allow the teacher to decide whether to revise or replace an item (item revision phase). Then, finally, the final draft of the test is subjected to validation if the intent is to make use of the test as a standard test for the particular unit or grading period. We shall be concerned with these concepts in this chapter.
6.1. Item Analysis

There are two important characteristics of an item that will be of interest to the teacher: (a) item difficulty, and (b) discrimination index. We shall learn how to measure these characteristics and apply our knowledge in making a decision about the item in question.

The difficulty of an item, or item difficulty, is defined as the number of students who are able to answer the item correctly divided by the total number of students. Thus:

Item difficulty = (number of students with correct answer) / (total number of students)
The item difficulty is usually expressed as a percentage.

Example: What is the item difficulty index of an item if 25 students are unable to answer it correctly while 75 answered it correctly?

Here, the total number of students is 100; hence, the item difficulty index is 75/100 or 75%.
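The computation in the example is a single division. A minimal sketch (the function name is ours):

```python
# Sketch: item difficulty = students answering correctly / total students.

def difficulty_index(correct, total):
    return correct / total

# The example above: 75 of 100 students answered the item correctly.
print(f"{difficulty_index(75, 100):.0%}")   # 75%
```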
One problem with this type of difficulty index is that it may not actually indicate that the item is difficult (or easy). A student who does not know the subject matter will naturally be unable to answer the item correctly even if the question is easy. How do we decide, on the basis of this index, whether the item is too difficult or too easy? The following arbitrary rule is often used in the literature:

Range of difficulty index    Interpretation       Action
0 - 0.25                     Difficult            Revise or discard
0.26 - 0.75                  Right difficulty     Retain
0.76 - 1.0                   Easy                 Revise or discard
Difficult items tend to discriminate between those who know and those who do not know the answer. Conversely, easy items cannot discriminate between these two groups of students. We are therefore interested in deriving a measure that will tell us whether an item can discriminate between these two groups of students. Such a measure is called an index of discrimination.

An easy way to derive such a measure is to measure how difficult an item is with respect to those in the upper 25% of the class and how difficult it is with respect to those in the lower 25% of the class. If the upper 25% of the class found the item easy yet the lower 25% found it difficult, then the item can discriminate properly between these two groups. Thus:
Index of discrimination = DU - DL

where DU is the difficulty index computed on the upper 25% of the class and DL is the difficulty index computed on the lower 25%.
Example: Obtain the index of discrimination of an item if the upper 25% of the class had a difficulty index of 0.60 (i.e. 60% of the upper 25% got the correct answer) while the lower 25% of the class had a difficulty index of 0.20.

Here, DU = 0.60 while DL = 0.20; thus, index of discrimination = 0.60 - 0.20 = 0.40.
Theoretically, the index of discrimination can range from -1.0 (when DU = 0 and DL = 1) to 1.0 (when DU = 1 and DL = 0). When the index of discrimination is equal to -1, this means that all of the lower 25% of the students got the correct answer while all of the upper 25% got the wrong answer. In a sense, such an index discriminates correctly between the two groups, but the item itself is highly questionable. Why should the bright ones get the wrong answer and the poor ones get the right answer? On the other hand, if the index of discrimination is 1.0, this means that all of the lower 25% failed to get the correct answer while all of the upper 25% got the correct answer. This is a perfectly discriminating item and is the ideal item that should be included in the test. From these discussions, let us agree to discard or revise all items that have a negative discrimination index, for although they discriminate correctly between the upper and lower 25% of the class, the content of the item itself may be highly dubious. As in the case of the index of difficulty, we have the following rule of thumb:

Index range       Interpretation                                  Action
-1.0 - -0.50      Can discriminate, but item is questionable      Discard
-0.49 - 0.45      Non-discriminating                              Revise
0.46 - 1.0        Discriminating item                             Include
Example: Consider a multiple-choice item for which the following
data were obtained:

Item     A      B      C      D
1        0     40     30     20     Total
         0     15      5      0     Upper 25%
         0      5     10      5     Lower 25%
The correct response is B. Let us compute the difficulty index and
index of discrimination:
Difficulty index = no. of students getting correct response / total
                 = 40/100 = 0.40 or 40%, within the range of a "good item"

The discrimination index can similarly be computed:

DU = no. of students in upper 25% with correct response / no. of students in
the upper 25%
   = 15/20 = 0.75 or 75%

DL = no. of students in lower 25% with correct response / no. of students in
the lower 25%
   = 5/20 = 0.25 or 25%

Discrimination index = DU - DL = 0.75 - 0.25 = 0.50 or 50%.
Thus, the item also has a "good discriminating power".
It is also instructive to note that distracter A is not an
effective distracter since it was never selected by the students.
Distracters C and D appear to have good appeal as distracters.
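The whole worked example can be reproduced in a few lines of Python; the dictionaries simply transcribe the option-count table above:

```python
# Option counts for item 1, transcribed from the table above.
totals = {"A": 0, "B": 40, "C": 30, "D": 20}   # all 100 students
upper  = {"A": 0, "B": 15, "C": 5,  "D": 0}    # upper 25% (20 students)
lower  = {"A": 0, "B": 5,  "C": 10, "D": 5}    # lower 25% (20 students)
key = "B"                                      # correct response
n_students = 100

difficulty = totals[key] / n_students          # 40/100 = 0.40
DU = upper[key] / sum(upper.values())          # 15/20 = 0.75
DL = lower[key] / sum(lower.values())          # 5/20  = 0.25
discrimination = DU - DL                       # 0.50

print(difficulty, DU, DL, discrimination)      # 0.4 0.75 0.25 0.5
```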
The Michigan State University Measurement and Evaluation
Department reports a number of item statistics which aid in
evaluating the effectiveness of an item. The first of these is the index
of difficulty, which MSU (http://www.msu.edu/dept/) defines as the
proportion of the total group who got the item wrong. "Thus a high
index indicates a difficult item and a low index indicates an easy
item. Some item analysts prefer an index of difficulty which is the
proportion of the total group who got an item right. This index may
be obtained by marking the PROPORTION RIGHT option on the
item analysis header sheet. Whichever index is selected is shown as
the INDEX OF DIFFICULTY on the item analysis printout. For
classroom achievement tests, most test constructors desire items with
indices of difficulty no lower than 20 nor higher than 80, with an
average index of difficulty from 30 or 40 to a maximum of 60.
The INDEX OF DISCRIMINATION is the difference between
the proportion of the upper group who got an item right and the
proportion of the lower group who got the item right. This index is70
Assesswayr oF Lexauns Ourcous (ASESSu04 1)
dependent upon the difficulty of an item, It may reach a maximum
Value of 100 for an item with an index of difficulty of 50, that
is, when 100% of the upper group and none of the lower group
answer the item correctly. For items of less than or greater than
50 difficulty, the index of discrimination has a maximum value of
less than 100. Interpreting the Index of Discrimination document
contains a more detailed discussion of the index of discrimination.”
(httpi/wwwmsuedu/dept).
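MSU's remark that the index of discrimination peaks at 100 for a 50-difficulty item can be checked with a short calculation. Assuming the class is split into equal upper and lower halves, DU + DL = 2p when the overall proportion correct is p, which gives the bound below. This derivation and the function name are illustrative additions, not from the source:

```python
def max_discrimination(p):
    """Upper bound on DU - DL when the class is split into equal
    upper and lower halves and the overall proportion correct is p.
    Since DU + DL = 2p with 0 <= DL <= DU <= 1, the largest gap is
    reached at DU = min(1, 2p) and DL = max(0, 2p - 1)."""
    return min(1.0, 2 * p) - max(0.0, 2 * p - 1.0)

for p in (0.2, 0.5, 0.8):
    print(p, max_discrimination(p))   # the bound peaks at p = 0.5
```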
More Sophisticated Discrimination Index

Item discrimination refers to the ability of an item to differentiate
among students on the basis of how well they know the material
being tested. Various hand calculation procedures have traditionally
been used to compare item responses to total test scores using high
and low scoring groups of students. Computerized analyses provide
more accurate assessment of the discrimination power of items
because they take into account responses of all students rather than
just high and low scoring groups.
The item discrimination index provided by ScorePak® is a
Pearson Product Moment correlation between student responses
to a particular item and total scores on all other items on the test.
This index is the equivalent of a point-biserial coefficient in this
application. It provides an estimate of the degree to which an
individual item is measuring the same thing as the rest of the items.
Because the discrimination index reflects the degree to which
an item and the test as a whole are measuring a unitary ability or
attribute, values of the coefficient will tend to be lower for tests
measuring a wide range of content areas than for more homogeneous
tests. Item discrimination indices must always be interpreted in the
context of the type of test which is being analyzed. Items with low
discrimination indices are often ambiguously worded and should
be examined. Items with negative indices should be examined to
determine why a negative value was obtained. For example, a negative
value may indicate that the item was mis-keyed, so that students who
knew the material tended to choose an unkeyed, but correct, response
option.
In practice, values of the discrimination index will seldom exceed
.50 because of the differing shapes of item and total score
distributions. ScorePak® classifies item discrimination as "good"
if the index is above .30; "fair" if it is between .10 and .30; and
"poor" if it is below .10.
In the tables presented for the levels of difficulty and discrimination, there
is a little area of intersection where the two indices will coincide
(between 0.56 to 0.67), which represents the good items in a test.
(Source: Office of Educational Assessment, University of Washington, USA,
http://www.washington.edu/oea/services/scanning_scoring/scoring/
item_analysis.html)
At the end of the Item Analysis report, test items are listed
according to their degrees of difficulty (easy, medium, hard) and
discrimination (good, fair, poor). These distributions provide a quick
overview of the test, and can be used to identify items which are not
performing well and which can perhaps be improved or discarded.
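The end-of-report listing can be imitated by bucketing each item's two indices. The discrimination labels follow ScorePak®'s thresholds quoted above, while the difficulty cut-offs for "easy", "medium", and "hard" (and the sample items) are assumptions for illustration:

```python
def difficulty_band(p):
    """p = proportion correct; the cut-offs here are illustrative."""
    return "easy" if p >= 0.80 else "medium" if p >= 0.30 else "hard"

def discrimination_band(r):
    """ScorePak labels: good above .30, fair from .10 to .30, poor below."""
    return "good" if r > 0.30 else "fair" if r >= 0.10 else "poor"

# (difficulty, discrimination) pairs for a few invented items
items = {1: (0.40, 0.50), 2: (0.85, 0.12), 3: (0.25, 0.05)}
for num, (p, r) in sorted(items.items()):
    print(num, difficulty_band(p), discrimination_band(r))
```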
Item analysis provides three kinds of information about a test item:
1. The difficulty of the item
2. The discriminating power of the item
3. The effectiveness of each alternative
Item analysis also offers the following benefits:
1. It provides useful information for class discussion of the
test.
2. It provides data which helps students improve their learning.
3. It provides insights and skills that lead to the preparation of
better tests in the future.
Index of difficulty = (Ru + RL) / T x 100

where:
Ru = the number in the upper group who answered the item correctly
RL = the number in the lower group who answered the item correctly
T  = the total number who tried the item
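This U-L index of difficulty is easy to verify against the earlier worked example, where 15 of the upper 20 and 5 of the lower 20 students answered the item correctly (the function name is ours):

```python
def ul_difficulty_index(Ru, RL, T):
    """Index of difficulty by the U-L method: the percentage of the
    combined upper and lower groups who answered the item right."""
    return (Ru + RL) / T * 100

print(ul_difficulty_index(Ru=15, RL=5, T=40))  # 50.0
```

Note that this uses only the upper and lower groups, so it can differ from a difficulty index computed over the whole class (40% in the earlier example).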