FACE VALIDATION TOOL
Checklist Validation Rubric for Expert Panel - VREP©
By Marilyn K. Simon and Jim Goes with input from Jacquelyn White
Dear Evaluator,
This adopted rubric aims to validate the reading test adapted from the study of Suarez
(2016) entitled The Relationship Between Reading Comprehension and Reading Strategies
Used by the Grade 8 Students.
In connection with this, we humbly request your precious time and expertise in
accomplishing this rubric, which contains two parts: Part 1 asks for your personal and
professional background, and Part 2 asks for your evaluation of the adapted reading test in
terms of its face and content validity.
The information that you provide will be helpful for the improvement of the
instrument.
Be assured that your responses will be treated professionally and confidentially.
Respectfully yours,
GUECO, GLEYANN D.
IGLESIA, MAILA R.
HIPOLITO, SHEMA S.
HULIPAS, JAMIE D.
MANALOTO, MARY ANGEL G.
NICASIO, NICOLE T.
PALLASIGUE, AIRA MAE G.
PARAS, CARMILYN P.
Research Team 4-BSEd English 3A
Noted by:
MR. LYNDON Q. MACANAS
Research Adviser

PROF. WILLIE A. ALAGANO
Research Statistician
PART 1. Personal Profile and Professional Background
Please fill out the following table with the requested information.

Name: _______________________________________
Position/Designation: _______________________________________
Age: _______________________________________
Sex: _______________________________________
Highest Degree: _______________________________________
Major/Specialization: _______________________________________
Length of Teaching Experience: _______________________________________
PART 2. Evaluation of the Adapted Reading Comprehension Test (Suarez, 2016)
In this section, you will find the list of criteria that you will use to evaluate and validate
the tool. You are requested to put a check mark (/) in the box that indicates your rating.
Furthermore, you may write your suggestions for improvement under the relevant criteria.
SCORE
1 - Not Acceptable (major modifications needed)
2 - Below Expectations (some modifications needed)
3 - Meets Expectations (no modifications needed but could be improved with minor changes)
4 - Exceeds Expectations (no modifications needed)
For each criterion below, the operational definitions are listed. Put a check mark (/) under the
score (1, 2, 3, or 4) that indicates your rating, and list any questions NOT meeting the standard
(indicate the page and question number); these questions need to be revised. Please use the
Comments and Suggestions section to recommend revisions.

Clarity
- The questions are direct and specific.
- Only one question is asked at a time.
- The participants can understand what is being asked.
- There are no double-barreled questions (two questions in one).
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Wordiness
- Questions are concise.
- There are no unnecessary words.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Negative Wording
- Questions are asked using the affirmative (e.g., instead of asking, “Which methods are not
  used?”, the researcher asks, “Which methods are used?”).
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Overlapping Responses
- No response covers more than one choice.
- All possibilities are considered.
- There are no ambiguous questions.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Balance
- The questions are unbiased and do not lead the participants to a response.
- The questions are asked using a neutral tone.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Use of Jargon
- The terms used are understandable by the target population.
- There are no clichés or hyperbole in the wording of the questions.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Appropriateness of Responses Listed
- The choices listed allow participants to respond appropriately.
- The responses apply to all situations or offer a way for those with unique situations to respond.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Use of Technical Language
- The use of technical language is minimal and appropriate.
- All acronyms are defined.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Application to Praxis
- The questions asked relate to the daily practices or expertise of the potential participants.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________

Relationship to Problem
- The questions are sufficient to resolve the problem in the study.
- The questions are sufficient to answer the research questions.
- The questions are sufficient to obtain the purpose of the study.
Score: 1  2  3  4     Questions NOT meeting the standard: ______________________
COMMENTS AND SUGGESTIONS:
_______________________________________________________________________
_______________________________________________________________________
_______________________________________________________________________
Permission to use this survey and to include it in the manuscript was granted by the authors, Marilyn
K. Simon, Jim Goes, and Jacquelyn White. All rights are reserved by the authors. Any other use
or reproduction of this material is prohibited.
_______________________________________
Evaluator’s Signature Over Printed Name