Content Validity Analysis of First Semester Formative Test On Biology Subject For Senior High School
Abstract. This study analyzes the content validity of a first-semester formative test instrument for the biology subject in senior high school as the early stage of developing a good test instrument. The test uses a multiple-choice-with-reason format intended for diagnosing biology misconceptions. Content validity was determined through quantitative analysis of expert judgments and qualitative expert reviews. Two quantitative approaches to content validity estimation (Lawshe's CVR and Aiken's V) were compared in the analysis of the newly developed instrument, which consists of 35 items. The data were obtained from a panel of five expert judges. The Content Validity Ratio (Lawshe's CVR) initially showed that only one item lacked inter-rater agreement on its essentiality to the test instrument (CVR = -0.2). Further content validity analysis showed that three items had a low content validity coefficient (Aiken's V), indicating poor item relevance to the test. The qualitative reviews recommended attention to the language of the question stems of nine items, including the three items above. The findings supported the revision of nine items.
1. Introduction
Nowadays, the quality of education is a focus of public accountability. One way in which such accountability is measured is the extent to which students' performance on teacher-made tests can predict their potential performance on standardized tests [1,2]. Teacher-made tests here can be interpreted as formative tests conducted while learning is still in progress, within the framework of formative assessment for evaluation. In recent years, assessment has been a central issue in the field of education and is often discussed by many stakeholders at the school, regional, national, and international levels. Educational assessment is important because it provides data on the extent to which the educational objectives being implemented have been achieved [3]. Educational assessment specifically refers to the assessment of learning. At the school level, the assessment of formal learning refers to the curriculum, which is designed in the form of subjects taught to students.
One branch of science used in instruction is biology. The structure of biological science contains a great deal of conceptual knowledge and is part of the national educational goals or, in a narrower sense, the objectives of learning biology. Campbell & Reece [4] describe biological knowledge as knowledge of living things and the details of their lives. Biology encompasses knowledge of living things from the simple to the complex and from the concrete to the abstract. Biology is a science whose material coverage is very broad and deep. In formal education, biology is included in the national curriculum and is studied from basic education up to higher education.
With regard to learning material, the complexity of the material can potentially constrain the achievement of learning goals, including in biology. The demands of mastering biological material on the knowledge dimension include mastery of factual, conceptual, procedural, and metacognitive knowledge. In this case, the biology teacher is required to facilitate and guide students to master concepts, principles, laws, theories, and so on [5]. According to Carin & Sund [6], conceptual knowledge is a product of science. In other words, students who study biology need to know the product aspects of science, especially in the field of biology. Because there are so many products of science (biology), students' knowledge in biology learning is often found to be incomplete.
In biology, students' learning ability is expected to increase through comprehensive and organized teaching-learning activities. In other words, the goal is to increase students' learning ability, especially in the field of biology. In relation to biology learning achievement, learning needs to be assessed well both while it is ongoing and after its completion. Earl [7,8] explained that the assessment of ongoing learning can support the achievement of learning objectives under the principle of assessment as learning. The assessment Earl refers to is a type of formative assessment applied to the learning process [9]. Through formative assessment, the teacher can collect data or related information about the learning process while it is still running, and the teacher can also provide feedback to the students so that they are motivated to maintain or improve their achievement.
Teachers are therefore advised to pay attention to formative assessment as a means of assessment as learning. One kind of information about the learning process that can be obtained from formative assessment to improve biology learning achievement is data about students' misconceptions. Misconceptions are a barrier to understanding biology [10] and must be remedied by the teacher in the learning process through formative assessment. In the context of assessment, data or information, particularly quantitative data, are obtained through measurement. To measure students' ability, a measurement tool called an instrument is needed. In this case, students' ability may be regarded as a variable that is generally hidden (a latent variable). Mardapi [11] describes how such variables can be measured based on manifest indicators of the variable, because it is very difficult to obtain data on hidden variables directly. Bollen [12] adds that "it is impossible to date the first use of latent variables". Constructing manifest indicators of a variable, along with developing the instrument, is not as easy as for physical variables. Therefore, assessment instruments need to be arranged according to the appropriate rules in order to obtain a quality instrument and good assessment data.
Azwar [13] implies that, in general, a measurement instrument is a tool that, because it meets certain criteria, can be used to measure an object or collect data on a variable. Subali [14] adds that, in essence, instruments can be divided into two kinds, namely tests and non-tests. The test group includes, for example, the TOEFL, academic potential tests, learning achievement tests, and intelligence tests, while the non-test group includes, for example, interview guidelines, questionnaires, observation sheets, checklists, rating scales, assessment scales, and so on. In a narrow sense, and particularly in the field of education, research instruments can be equated with assessment instruments for measuring students' ability.
Since the object measured by an assessment instrument is a hidden variable, questions emerge as to whether a measurement tool really measures what it is intended and supposed to measure, as well as the extent to which the measurement tool is reliable, genuinely useful, and trustworthy. These questions refer to the two principal requirements of a good instrument, namely validity and reliability. The quality of a research instrument greatly affects the accuracy of research results. Even when the study design, data scale, and statistical tests applied are appropriate, the conclusions drawn still depend on the quality of the research instrument. When a research instrument has low validity and reliability, the research conclusions or statistical hypothesis test results become inappropriate. Subali [14] emphasizes that analyses of validity and reliability are very important and need to be carried out in the development of an instrument.
Based on the description above, it can be said that formative assessment of biology learning can be applied to measure and at the same time support the achievement of learning objectives. The assessment in question can be carried out using instruments developed in accordance with the rules of instrument development. Given the importance of instrument quality for obtaining accurate data on students' ability, the quality of the instrument needs to be investigated, one way being an analysis of the content validity of the instrument. This research specifically focuses on content validity analysis as an early stage in developing a formative assessment instrument, in the form of a test, for the biology subject. In addition, the research compares techniques for determining the content validity of a test instrument in relation to developing a test of good quality. The results of this research are also expected to serve as a reference for readers in applying content validity determination techniques in instrument development.
2. Method
The content validity in this study was determined by quantitative analysis of expert judgments and qualitative expert reviews. Two quantitative approaches to content validity estimation (Lawshe's CVR [15] and Aiken's V [16]) were compared in the analysis of a newly developed instrument consisting of 35 items. The data were obtained from a panel of five expert judges: two experts in biology education (V1 and V2), one biologist (V3), one expert in learning assessment (V4), and one practitioner, a high school biology teacher (V5). The research instrument used was an assessment sheet containing a column of essentiality statements (with three options, i.e., essential; useful but not essential; and not useful), a five-point relevance score for each item, and a column for giving advice. This assessment sheet was filled out by the experts for the test items being developed.
The essentiality statement and the score for each item were used for the quantitative content validity analysis using Lawshe's CVR formula (from the essentiality data, where essential judgments are scored 1) and Aiken's V (from the item scores), while the suggestions/feedback from the experts were analyzed qualitatively for the refinement of the instrument. For interpretation purposes, an item is valid according to the CVR if the value resulting from the analysis is greater than or equal to 0, and an item is valid according to the V index if its V value is greater than or equal to 0.8. Table 1 contains the formulas for calculating content validity according to the CVR and the V index.
Table 1. Formulas for calculating content validity quantitatively [16].

Lawshe's CVR: CVR = (2ne/n) – 1
Aiken's V:    V = ∑s / [n(c – 1)]
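To illustrate how the two formulas in Table 1 operate on a single item's panel data, the following minimal Python sketch applies them to a hypothetical item; the function names, the example votes and ratings, and the fixed five-point scale (c = 5, lowest category lo = 1) are assumptions made for illustration only, not part of the instrument.

```python
def lawshe_cvr(n_essential, n_experts):
    """Lawshe's Content Validity Ratio: CVR = (2*ne / n) - 1,
    where ne is the number of experts rating the item 'essential'."""
    return (2 * n_essential / n_experts) - 1


def aiken_v(ratings, lo=1, c=5):
    """Aiken's V = sum(r - lo) / [n * (c - 1)], where s = r - lo for each rating r,
    lo is the lowest rating category and c is the number of rating categories."""
    s = sum(r - lo for r in ratings)
    return s / (len(ratings) * (c - 1))


# Hypothetical item: 4 of 5 experts call it essential; relevance scores are 5, 4, 4, 5, 4.
print(round(lawshe_cvr(4, 5), 2))          # 0.6  -> valid under the cut-off CVR >= 0
print(round(aiken_v([5, 4, 4, 5, 4]), 2))  # 0.85 -> valid under the cut-off V >= 0.8
```

Under the cut-offs used in this study, such a hypothetical item would be retained by both criteria.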
3. Results and Discussion

3.1. Results
The experts' quantitative statements and scores show the essentiality and relevance of the items of the formative test instrument being developed for the senior high school biology subject in the multiple-choice-with-reason format. The essentiality statements and scores given by the five experts for the 35 test items are presented in Table 2.
Table 2. The essentiality statements and relevance scores given by the five experts for the test items.
No. Item   V1          V2          V3          V4          V5          Total
           Es.a  Sc.b  Es.a  Sc.b  Es.a  Sc.b  Es.a  Sc.b  Es.a  Sc.b  Es.a  Sc.b
1 1 5 0 3 1 4 1 5 1 4 4 21
2 1 5 0 2 0 4 1 5 0 3 2 19
3 1 5 1 4 1 4 1 5 1 4 5 22
4 1 5 1 3 1 4 1 5 1 4 5 21
5 1 5 1 3 1 5 1 5 1 5 5 23
6 0 4 0 3 1 5 1 5 1 4 3 21
7 1 4 1 4 1 4 0 4 1 4 4 20
8 1 5 1 4 1 4 1 5 1 4 5 22
9 1 5 1 4 1 4 1 5 1 4 5 22
10 1 4 1 3 1 4 1 3 0 4 4 18
11 1 5 1 4 1 4 0 4 1 5 4 22
12 1 5 1 3 1 4 1 5 1 5 5 22
13 1 5 1 4 1 4 1 5 1 4 5 22
14 1 5 1 3 1 4 1 5 1 5 5 22
15 1 5 1 3 1 4 1 5 1 5 5 22
16 1 5 1 4 1 4 1 5 1 4 5 22
17 1 5 1 4 0 4 1 5 1 4 4 22
18 1 5 1 4 1 4 1 5 1 5 5 23
19 1 5 1 4 1 4 1 5 1 5 5 23
20 1 5 1 3 1 4 1 5 1 5 5 22
21 1 5 0 3 1 4 1 5 1 4 4 21
22 1 5 1 4 0 4 1 5 1 3 4 21
23 1 5 1 4 1 5 1 5 0 5 4 24
24 1 5 1 3 1 4 1 5 1 4 5 21
25 1 5 1 3 1 4 1 5 1 4 5 21
26 1 5 1 4 1 4 1 5 1 4 5 22
27 1 5 1 4 1 4 1 5 1 4 5 22
28 1 5 1 4 1 4 1 5 1 4 5 22
29 1 5 1 4 1 5 1 5 1 4 5 23
30 1 5 1 4 1 4 1 5 1 4 5 22
31 1 5 1 4 1 4 1 5 1 5 5 23
32 0 5 1 4 1 4 1 5 1 4 4 22
33 0 5 1 4 1 5 1 3 1 4 4 21
34 1 5 1 4 1 5 0 3 1 4 4 21
35 1 5 1 4 0 4 1 5 1 5 4 23
a Es. = essentiality; b Sc. = item relevance score.
Table 4. Advice from the experts' qualitative review of the test instrument being developed.

Expert   Advice
V1       Give attention to the basis of the blue-print construction. The biological material in the question stems needs to be specified, for example by focusing on the structure and function of living things. The answer choices in item 10 are not homogeneous enough.
V2       In item 11, the basis of classification is already mentioned, so the construction of the statement should be modified. The construction of item 26 needs to be changed to fit the indicator "comparing".
V3       Some indicators need to be revised so that they become essential to test. If possible, the pictures should be clarified. The choice of words needs to be reviewed.
V4       Some verbs in the indicators need to be formulated precisely according to the aspects measured. The answer choices in item 10 are not homogeneous enough. Negative statements need to be underlined or printed in bold. Item 33 needs to be changed so that its construction matches the indicator.
V5       The indicators are highly relevant to the aspects measured, but the statements in the question stems need to be more precise. In item 2, the branch of biology needs to be formulated precisely. In item 6, the plant species should be mentioned. In item 7, emphasis should be added that the levels of biodiversity must be mentioned in order. In item 11, the basis of classification is already mentioned, so the construction should be modified. In item 25, there is no need to mention the role of bacteria in the question stem. Item 29 already mentions the causal relationship, which can confuse students/testees when filling in the reason. Items 26 and 33 need to be changed in construction.
Qualitative results summary: the experts' advice directs the improvement of construction, grammar, and writing in general, and in particular the revision of nine items, i.e., items 2, 6, 7, 10, 11, 25, 26, 29, and 33.
3.2. Discussion
One of the fundamental questions in assessment is how accurately the values obtained reflect the ability possessed by the subject being assessed [17]. In other words, the question relates to the validity of the assessment results or, in this case, the validity of the instruments used to assess. Assessment instruments need to be developed in accordance with the rules of instrument development/construction. For learning in schools, assessment instruments can be developed in varied forms, ranging from multiple-choice tests to extended formats. The validity of the instrument is one of the requirements for constructing good-quality instruments, including test instruments. Therefore, test construction requires a validity analysis, both to refine the test items and to demonstrate that the test results can be meaningful and useful for the assessment as expected [18].
In instrument development, the first step a test developer should take is to investigate content validity [19]. Hinkin [20] stated that content validity is required in the development phase of a psychological scale; in this case, the development of a psychological scale can be equated with the development of a test. This can be done by asking experts to judge the suitability of the items for the concept being measured. The acceptable index of agreement or relevance should be determined before the test is tried out. The validity analysis conducted in this study is a content validity analysis that is part of the initial construction of formative tests for the biology subject in senior high school. There are several ways to analyze the content validity of a test instrument, either quantitative or qualitative in nature [21]. Among the many techniques for analyzing content validity quantitatively, the two applied in this analysis are the Lawshe and Aiken formulas. Both are simple techniques that are easy to apply in proving the validity of an instrument based on expert judgment.
Based on the experts' judgments presented in Table 2, calculations using the Lawshe (CVR) and Aiken (V) formulas yield the validity indices in Table 3, with an overall index of 0.81 for the test CVR and 0.84 for the V index. These values indicate that the content validity of the instrument is in the high category. From the results of the CVR analysis, only one item is declared invalid (item 2), and the V index for this item is also below 0.8. In the analysis using Aiken's formula, in addition to item 2, two other items fall into the invalid category, namely items 7 and 10. In other words, quantitatively, items 2, 7, and 10 need to be revised so that they become useful in the test. Revision of the items in question also needs to consider the qualitative review results (Table 4) in the form of expert advice on the related items. For item 2, most of the experts stated that the item is not essential but still useful in a test, because the item is still quite relevant to the "exemplifying" aspect of the scope of biology (biology as a science), whereas the construction and language of items 7 and 10 need to be fixed. In item 10 in particular, the homogeneity of the answer choices also needs attention.
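To illustrate how these classifications follow from Table 2 and the formulas in Table 1, consider item 2: two of the five experts marked it essential, so CVR = (2 × 2/5) – 1 = -0.2, which is below the cut-off of 0; its relevance scores were 5, 2, 4, 5, and 3, so ∑s = ∑(r – 1) = 14 and V = 14 / [5 × (5 – 1)] = 0.70, which is below 0.8. The same calculation gives V = 0.75 for item 7 and V = 0.65 for item 10, while their CVR values (0.6) remain above the cut-off.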
In addition to the three items above, Table 4 shows several other items that need to be the focus of revision, namely items 6, 11, 25, 26, 29, and 33. The revisions or improvements emphasized by the experts concern the writing, in terms of grammar, and the construction of the test items. The experts also advised attending to both of these aspects in the preparation of the test items as a whole. The instrument revision based on these content validity analysis results supports the subsequent empirical proof of the validity of the test instrument once it is tried out. Content validity is critical to prove in order to support the accuracy of the measurement results of a test. However, Hinkin [22] mentions that the content validity determination techniques that have been applied by researchers cannot ensure that a scale or test has valid content; rather, they provide evidence that the items are a reasonable representation of the construct under testing and reduce the need for improvements in the future.
Polit and Beck [23] concluded that clarity about content validation in the development of a test is very necessary. Content validation gives recommendations that can be used to improve the content of an instrument that will be used for assessment or research in learning. Furthermore, Polit and Beck [23] and Polit et al. [24] describe content validity as a measure of agreement among experts about the relevance of the items. This is in accordance with the formulations presented by Azwar [16], i.e., the validity formulas according to Lawshe (CVR) and Aiken (V). Wynd, Schmidt, & Schaefer [25] add that the content validity of an instrument is often determined through either a qualitative expert review or quantitative agreement among reviewers.
The two quantitative approaches to content validity analysis applied here are in accordance with the findings of Hinkin [22] that validity testing can be done continuously and need not rely on only one technique. From the data analysis, there is a difference in results: the CVR analysis identifies only one invalid item, while the V index analysis identifies three invalid items. This kind of difference is indeed often found in content validity analysis [26]. Furthermore, when the results of the content validity analyses from the two formulas are correlated using the Pearson correlation, the result obtained is not very high (r = 0.514). Since the correlation is not negative, it can be said that the two techniques are in line (not contradictory), but because the value is not very high, decisions about which items are valid and invalid should preferably be based on one of the two calculations. Although the results according to Lawshe's CVR and Aiken's V are slightly different, both techniques are still often recommended by experts for analyzing content validity quantitatively [16].
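As a cross-check of this comparison, the per-item indices can be recomputed from the totals in Table 2 and then correlated; the short Python sketch below (assuming Python 3.10+ for statistics.correlation, with variable names chosen for illustration) returns values close to the overall CVR of 0.81, the overall V of 0.84, and the Pearson r of 0.514 reported above.

```python
from statistics import correlation

# Per-item totals taken from Table 2: number of "essential" votes (out of 5 experts)
# and the summed relevance score (five experts, each rating on a 1-5 scale).
essential = [4, 2, 5, 5, 5, 3, 4, 5, 5, 4, 4, 5, 5, 5, 5, 5, 4, 5, 5, 5,
             4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 4, 4, 4, 4]
score_sum = [21, 19, 22, 21, 23, 21, 20, 22, 22, 18, 22, 22, 22, 22, 22,
             22, 22, 23, 23, 22, 21, 21, 24, 21, 21, 22, 22, 22, 23, 22,
             23, 22, 21, 21, 23]

cvr = [(2 * ne / 5) - 1 for ne in essential]         # Lawshe's CVR per item
v = [(s - 5) / (5 * (5 - 1)) for s in score_sum]     # Aiken's V per item (lo = 1, c = 5)

print(round(sum(cvr) / len(cvr), 2))   # overall CVR, about 0.81
print(round(sum(v) / len(v), 2))       # overall V index, about 0.84
print(round(correlation(cvr, v), 3))   # Pearson r between the two, about 0.514
```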
Based on this description, it can be stated that content validity analysis is important as an initial step in test construction to prepare a good measurement. Study of content validity is still needed in further research [27], either qualitatively or quantitatively [28]. On the other hand, in addition to content validity, there are several other types of validity that are also important to consider, such as factorial, concurrent, construct, and predictive validity. These types of validity can be further analyzed through trials of the instrument in the field [16]. The results of the content validity analysis in this study are an early stage in the construction of formative tests for the biology subject in senior high school. In the next stage, to complete the construction of the tests, revisions will be carried out, the instrument will be tried out, and the validity as well as the reliability of the test will be analyzed empirically.
4. Conclusion
The content validity of a test instrument can be determined using either a quantitative or a qualitative approach based on expert judgment/review. Quantitatively, the content validity of the test instrument can be analyzed using the Lawshe (CVR) or Aiken (V) formula, and qualitatively it can be examined through expert review of the content material, construction, and language. The Content Validity Ratio (Lawshe's CVR) initially showed that only one item lacked inter-rater agreement on its essentiality to the test instrument (CVR = -0.2). Further content validity analysis showed that three items had a low content validity coefficient (Aiken's V), indicating poor item relevance to the test. The qualitative reviews recommended attention to the language of the question stems of nine items, including the three items above. The findings supported the revision of nine items. Furthermore, with regard to the quantitative analysis of content validity, it is advisable to use one technique/formula for the calculation so that the revision of the instrument can be more focused.
References
[1] Notar C E, Zuelke D C and Wilson J D 2013 The table of specifications: Insuring accountability
in teacher made tests J. of Instructional Psychology (2013) p 115
[2] Kinyua K and Okunya L O 2014 Validity and reliability of teacher-made tests: Case study of
year 11 physics in Nyahururu District of Kenya African Edu. Res. Journal 02(2014) p 61-71
[3] Subali B 2016 Prinsip Asesmen dan Evaluasi Pembelajaran Edisi Kedua (Yogyakarta: UNY
Press) p 1
[4] Campbell N A and Reece J B 2010 Biologi Jilid ke-1 (Edisi ke-8) Translated by Damaring Tyas
Wulandari (Jakarta: Erlangga) p 18
[5] Direktorat Pembinaan SMA Ditjen Pendidikan Menengah 2014 Pembelajaran Biologi melalui
Pendekatan Saintifik (Jakarta: Ministry of Education of Republik Indonesia).
[6] Carin A A and Sund R B 1989 Teaching Science Through Discovery (Columbus: Merrill
Publishing Company) p 4-5
[7] Earl L M 2013 Assessment For Learning; Assessment As Learning: Changing Practices Means
Changing Beliefs (Ontario Institute for Studies in Education, University of Toronto and The
University of Auckland) Issue 2
[8] Earl L M 2014 Assessment as Learning: Using Classroom Assessment to Maximize Student
Learning (Australia: Hawker Brownlow Education)
[9] Clark I 2008 Assessment is for learning: Formative assessment and positive learning
interactions Florida Journal of Educational Administration & Policy 02(2008)001 p 1-16
[10] Tekkaya C 2002 Misconceptions as Barrier to Understanding Biology (Ankara: Hacettepe
Universitesi Egitim Fakultesi Dergisi)
[11] Mardapi D 2017 Pengukuran, Penilaian, dan Evaluasi Pendidikan, Edisi Kedua (Yogyakarta:
Parama Publishing)
[12] Bollen K A 2002 Latent variables in psychology and the social sciences Annual Review of Psychology
(http://www.stat.cmu.edu)
[13] Azwar S 2015 Reliabilitas dan Validitas, Edisi Keempat (Yogyakarta: Pustaka Pelajar) p 2
[14] Subali B 2016 Pengembangan Tes: Beserta Penyelidikan Validitas dan Reliabilitas secara
Empiris (Yogyakarta: UNY Press) p 46
[15] Lawshe C H 1975 A quantitative approach to content validity Personnel Psychology
28(1975)004 p 563-575
[16] Azwar S 2017 Penyusunan Skala Psikologi, Edisi Kedua (Yogyakarta: Pustaka Pelajar) p 134-
135
[17] Miller M D, Linn R L and Gronlund N E 2009 Measurement and Assessment in Teaching (New
Jersey: Pearson Education) p 70
[18] Yaghmaie F 2003 Content validity and its estimation Journal of Medical
Education 03(2003)001 p 25-27
[19] Rubio D M, Berg-Weger M, Tebb S S, Lee E S and Rauch S 2003 Objectifying content validity:
Conducting a content validity study in social work research Social Work Research
27(2003)002 p 94-104
[20] Hinkin T R 1995 A review of scale development practices in the study of organizations Journal
of Management 21(1995)005 p 967-988
[21] Haynes S N, Richard D and Kubany E S 1995 Content validity in psychological assessment: A
functional approach to concepts and methods Psychological Assessment 07(1995)003 p 238-
247
[22] Hinkin T R, Tracey J B and Enz C A 1997 Scale construction: developing reliable and valid
measurement instruments Journal of Hospitality & Tourism Research 21(1997)001 p 100-
120
[23] Polit D F and Beck C T 2006 The content validity index: Are you sure you know what’s being
reported? Critique and recommendations Research in Nursing & Health 29(2006)005 p 489-
497
[24] Polit D F, Beck C T and Owen S V 2007 Focus on research methods: Is the CVI an acceptable
indicator of content validity? Appraisal and recommendations Res Nurs Health 30(2007) p
459-467
[25] Wynd C A, Schmidt B and Schaefer M A 2003 Two quantitative approaches for estimating
content validity Western Journal of Nursing Research 25(2003)005 p 508-518
[26] Robins J A and Wiersema M S 2003 The measurement of corporate portfolio strategy: analysis
of the content validity of related diversification indexes Strategic Management Journal
24(2003)001 p 39-59
[27] Hendryadi H 2017 Validitas isi: tahap awal pengembangan kuesioner Jurnal Riset Manajemen
dan Bisnis (JRMB) Fakultas Ekonomi UNIAT 02(2017)002 p 169-178
[28] Winarti, Cari, Suparmi, Widha Sunarno and Istiyono E 2017 Development of two tier test to
assess conceptual understanding in heat and temperature Journal of Physics Conf. Series
795(2017)012052
Acknowledgement
The authors thank the experts who were willing to evaluate the test instrument for the senior high school biology subject that is being developed.