
INSTRUMENT
DEVELOPMENT
What is Instrument Development?
Process of creating tools to collect data in
research

Examples:
• Questionnaire – written set of questions
• Interview Guide – verbal questions
Questionnaire
A written research instrument with predetermined questions

3 Parts:

1. General Instructions
2. Profile of Respondent
3. Body / Items
Part 1: General Instructions - this part gives the participant an idea of the purpose of your research and an overview of the topic of your questionnaire.

This part includes:


• Title/Introduction of the researcher - Introduce
the title of your study and you may give your name
and the institution you are affiliated with.
• Purpose of the questionnaire - State the
purpose or aim of your study.
• Confidentiality statement - Assure the participants that their responses will remain confidential.
• Voluntary participation - Make the participants aware that they do not have to complete the questionnaire, and that they may skip any items they are uncomfortable with.
• How to submit the questionnaire - Instruct the participants on how to return the completed questionnaire.
Part 2: Profile of the Respondent - this asks for the personal information of the participant. It may also be called demographic characteristics or biodata. Consider which variables are relevant and capture only information that is important to your study.

Part 3: Body/Questionnaire Items - this contains the questions or items that need to be answered by the participant.
Consider the following when constructing the questionnaire items:

Clarity
• The questions are direct and specific.
• Only one question is asked at a time.
• The participants can understand what is being asked.
• There are no double-barreled questions (two questions in one).

Wordiness
• Questions are concise.
• There are no unnecessary words.

Negative Wording
• Questions are asked using affirmative wording (e.g., instead of asking "Which methods are not used?", the researcher asks, "Which methods are used?").
Consider the following when constructing the questionnaire items:

Overlapping Responses
• No response covers more than one choice.
• All possibilities are considered.

Balance
• The questions are unbiased and do not lead the participants to a response.
• The questions are asked using a neutral tone.

Use of Jargon
• The terms used are understandable by the target population.
• There are no clichés or hyperbole in the wording of the questions.
Consider the following when constructing the questionnaire items:

Appropriateness of Responses Listed
• The choices listed allow participants to respond appropriately.
• The responses apply to all situations or offer a way for participants with unique situations to respond.

Use of Technical Language
• The use of technical language is minimal and appropriate.
Developing Interview Questions

An interview is a research instrument wherein the researcher verbally asks questions to the participants of the study to gather the information the research is looking for.

Order of Interview Questions


First set of questions: Opening questions that establish a friendly atmosphere or set the mood of the participant.
Second set of questions: Open-ended
questions that generate the participant's views or
ideas about the topic.

Example: "What ideas do you have about


Distance Learning?"

Third set of questions: Closed-ended questions that elicit specific answers about the research topic, or questions answerable with 'Yes' or 'No'.
Fourth set of questions: Closing questions that give the participants a chance to give their views or comments about the topic.

Example:"What has been your overall


impression on the new normal in education?"
VALIDITY AND
RELIABILITY OF THE
RESEARCH
INSTRUMENT
Reliability refers to the consistency of a research instrument, while validity measures how well it captures what it is intended to measure. An example is an alarm clock set for 6:30 a.m. that always rings at 7:00 a.m.: it is reliable but not valid. Ensuring both validity and reliability is essential in conducting and evaluating research.
The different attributes of reliability are described in Table 1 below.

Attributes of Reliability

Homogeneity (or internal consistency) - The extent to which all the items on a scale measure one construct
Stability - The consistency of results using an instrument with repeated testing
Equivalence - Consistency among responses of multiple users of an instrument, or among alternate forms of an instrument
Homogeneity or internal consistency can be assessed with the use of split-half reliability, the Kuder-Richardson coefficient, and Cronbach's α.

Split-half reliability - Here, the results of a test, or instrument, are divided in half. Correlations are calculated comparing both halves. Strong correlations indicate high reliability, while weak correlations indicate the instrument may not be reliable.
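To make the idea concrete, here is a minimal sketch of a split-half calculation in Python. The item scores, variable names, and the odd/even split are illustrative assumptions, not part of the original material; the half-test correlation is adjusted with the Spearman-Brown correction to estimate the reliability of the full-length instrument.

```python
# A minimal sketch of split-half reliability, assuming a small matrix of
# hypothetical item scores (rows = participants, columns = items).
import numpy as np

scores = np.array([
    [4, 5, 4, 5, 3, 4],
    [2, 3, 2, 2, 3, 2],
    [5, 4, 5, 5, 4, 5],
    [3, 3, 4, 3, 3, 3],
    [1, 2, 1, 2, 2, 1],
])

# Split the items into two halves (here: odd- vs even-numbered items).
half_1 = scores[:, 0::2].sum(axis=1)
half_2 = scores[:, 1::2].sum(axis=1)

# Correlate the two half-scores.
r_half = np.corrcoef(half_1, half_2)[0, 1]

# Spearman-Brown correction estimates the reliability of the full-length test.
r_full = (2 * r_half) / (1 + r_half)
print(f"Half-test correlation: {r_half:.2f}, corrected reliability: {r_full:.2f}")
```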
Kuder-Richardson test - It is a more complicated version of the split-half test. In this process the average of all possible split-half combinations is determined and a correlation between 0 and 1 is generated. This test is more accurate than the split-half test but can only be completed on questions with two answers (e.g., yes or no, 0 or 1).
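As a sketch, the coefficient can also be computed directly from the textbook KR-20 formula. The dichotomous answer matrix below is hypothetical and only illustrates the arithmetic.

```python
# A minimal sketch of the Kuder-Richardson formula (KR-20), assuming a
# hypothetical matrix of dichotomous answers (1 = yes/correct, 0 = no/incorrect).
import numpy as np

answers = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 0, 1],
])

k = answers.shape[1]                          # number of items
p = answers.mean(axis=0)                      # proportion answering 1 on each item
q = 1 - p                                     # proportion answering 0
total_var = answers.sum(axis=1).var(ddof=1)   # variance of total scores

kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total_var)
print(f"KR-20 reliability: {kr20:.2f}")
```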
Cronbach's α - It is the most commonly used test to determine the internal consistency of an instrument. In this test, the average of all correlations in every combination of split-halves is determined. Instruments with questions that have more than two responses can be used in this test. The Cronbach's α result is a number between 0 and 1. An acceptable reliability score is one that is 0.7 or higher.
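Below is a minimal sketch of the standard Cronbach's α formula, computed from the item variances and the variance of the total score. The Likert-type responses are hypothetical and only show the calculation.

```python
# A minimal sketch of Cronbach's alpha, assuming a hypothetical matrix of
# Likert-type responses (rows = participants, columns = items).
import numpy as np

responses = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
    [1, 2, 1, 2],
])

k = responses.shape[1]
item_vars = responses.var(axis=0, ddof=1)        # variance of each item
total_var = responses.sum(axis=1).var(ddof=1)    # variance of total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # 0.7 or higher is usually acceptable
```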
Stability can be tested with the use of test-retest and parallel or alternate-form reliability testing.

• Test-retest reliability - It is assessed when an instrument is given to the same participants more than once under similar circumstances or situations. A statistical comparison is made between the participants' test scores for each of the times they have answered it.
• Parallel-form reliability (or alternate-form reliability) - It is similar to test-retest reliability except that a different form of the original instrument is given to participants in subsequent tests. The domain, or concepts, being tested are the same in both versions of the instrument, but the wording of items is different. For an instrument to demonstrate stability there should be a high correlation between the scores each time a participant completes the test.
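Both stability checks come down to correlating two sets of scores from the same participants, whether from two administrations of the same instrument (test-retest) or from two alternate forms (parallel-form). A minimal sketch, assuming hypothetical total scores:

```python
# A minimal sketch of a stability check: correlate scores from the same
# participants on two occasions (test-retest) or on two alternate forms
# of the instrument (parallel-form). Scores are hypothetical.
from scipy.stats import pearsonr

first_administration = [28, 35, 22, 40, 31, 27, 36]
second_administration = [30, 34, 24, 39, 29, 28, 35]

r, p_value = pearsonr(first_administration, second_administration)
print(f"Correlation between administrations: r = {r:.2f}")
# A high positive correlation suggests the instrument gives stable results.
```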
There are four types of validity. These are described in Table 2 below.

Types of Validity

Content Validity - The extent to which a research instrument accurately measures all aspects of a study
Construct Validity - The extent to which a research instrument (or tool) accurately measures what it intends to measure
Criterion Validity - The extent to which a research instrument is related to other instruments that measure the same variables
Face Validity - The extent to which a research instrument appears to be valid and measure what it is supposed to measure
Ways to improve Validity and Reliability:

1. Define your goals and objectives clearly.
2. Match your instrument to your goals and objectives. Additionally, have the test reviewed by others to obtain feedback from an external party who is less invested in the instrument.
3. Assess for troublesome wording or other technical errors.
4. Compare your instrument with other measures.
Adopting or Adapting an Instrument

The researcher may also choose to adopt or adapt


pre-existing instruments to save time and effort in
developing a new one. To use an instrument that
another researcher has developed and validated is
a good way to determine that the instruments
used in a research study are both reliable and
valid. This will help you in three ways:
• You can ensure that you have accurately
measured the variables you are studying.
• The significance of your study can be related
to previous research that has already been
conducted in your field.
• It saves time and energy by not having to develop a new instrument.
Adopting an instrument requires very little effort
and is quite simple. However, there still might be a
few modifications that are necessary even though
an instrument is adopted. When adopting an
instrument, the researcher must include the
following in the instrument description:

• Who developed the instrument?


• Who validated the instrument?
• What are the other studies that have used the
instrument?
Adapting an instrument requires more changes
than adopting an instrument. In this situation, the
researcher follows the general format of another
instrument but adds items, removes items, and/or
changes the content of each item. When adapting
an instrument, the researcher should report the
same information in the Instruments section as
when adopting the instrument, but should also
include the
following in the instrument description:
THANK
YOU
