
International Journal of Information System and Engineering
Vol. 6 (No. 2), November 2018
ISSN: 2289-7615
DOI: 10.24924/ijise/2018.11/v6.iss2/76.92
www.ftms.edu.my/journals/index.php/journals/ijise

This work is licensed under a Creative Commons Attribution 4.0 International License.

Research Paper

BLOOM’S TAXONOMY-BASED EXAMINATION QUESTION PAPER GENERATION SYSTEM

Yulia Timakova
FTMS College Malaysia
[email protected]

Kinn Abass Bakon
FTMS College Malaysia
[email protected]

Abstract

The assessment process is an essential activity in educational institutions to test the performance of learners. The quality of examination papers is directly linked to the evaluation of the quality of graduates. Nevertheless, designing question papers is a laborious task for academics. This paper aims to research and analyze the current assessment process and to build an automated examination question paper generation system (AQPGS) to replace the manual method practiced by academics. The AQPGS prototype is intended to enable academics to produce, at the click of a button, quality examination papers that are unbiased and aligned with learning outcomes, while saving time and resources in the assessment process. The system prototype was developed in the Visual Basic language and connects to an MS Access database. It covers MCQ, True/False and open-ended questions. A mapping algorithm is integrated for automated categorization of open-ended questions according to the Bloom’s Taxonomy hierarchy, using keyword queries and random selection of questions. The generated paper can be saved as a text document and edited.

Key Terms: Assessment System, Automation, Question Paper Generator, Bloom’s Taxonomy, Mapping Algorithm.

1. Introduction

As education is a key to success, the examination process is a critical activity for educational institutions to evaluate the performance of learners. The content of examination papers is the main criterion for ensuring the quality of education that institutions provide to their students. Examinations also serve as a guide to students on their gradual journey to knowledge. That is why a proper examination paper compilation procedure is essential.

However, manual preparation of examination papers can be a tedious, time-consuming and challenging routine for academic staff if the institution does not practice a computerized method of compiling the materials. The subject of this research is therefore the examination process at FTMS College; the research aims to evaluate the drawbacks of the current manual method and to analyze the assessment process in terms of discrepancies and resource use. The outcome of this study shall be the development of a more efficient solution to enhance the examination paper preparation procedure, together with a recommendation on how to further improve the assessment process.

This paper outlines the information gathered from publications relating to the research area and its interpretation within the current study, the analysis of the data collected from users and the resulting findings, the system structure design, and the development of a prototype automatic question paper generation system based on Bloom’s Taxonomy. The proposed system aims to provide an efficient alternative for overcoming the issues associated with the manual process of assessment preparation.

2. Literature Review

2.1 Importance of quality assessment in education.

In the context of education, assessment encompasses the procedures and methods that instructors apply to measure, evaluate and record the academic preparedness, learning progression, skill attainment and academic needs of students. On this basis, instructors can provide customized instructional guidance, lesson planning or social services (The Glossary of Education Reform, 2015).

According to Ewell (2008), over the past decade the global tendency in higher education has drifted away from the conventional teacher-centered approach, which focused on the instructor's input and assessed how well students absorbed the materials. Such an assessment method was considered too limited in evaluating learning, as it neglects the coherent ability that integrates various individual skills into overall practice. The educational trend has therefore shifted towards a student-centered perspective concentrating on learning outcomes, that is, what learners are expected to be able to do at the end of the studying experience. Furthermore, employers and educational strategists will be better aware of graduates’ capabilities for employment and accountability purposes (Ewell, 2008).

Learning outcomes denote students’ accomplishments resulting from involvement in a certain set of teaching and learning activities. The three classifications of students’ learning outcomes are cognitive, affective and psychomotor (Bloom et al., 1956).

It is essential that teaching and learning experiences, learning outcomes and assessment are developed as three aligned components (Biggs, 1996; Biggs, 1999; Biggs and Tang, 2011). The combination of these elements determines conformity and uniformity within the syllabus, where the expected learning outcomes align with the teaching, learning and assessment processes in a logical and consistent way.

It follows that a student-centered approach to the organization of educational processes provides for better learning and more genuine student assessment. It has also been determined that such a method is particularly essential in the education of Information Systems students (Landry et al., 2008).

Indeed, constructing a broad and methodical assessment system is a tough task. Fortunately, there are mechanisms and approaches specifically aimed at easing this work; however, those procedures cannot substitute for expert judgment, only complement it. Generating a balanced, comprehensive assessment system requires the right perception, supervision, cooperation, communication and acceptance of change (Oliver, 2015).

2.2 Bloom’s Taxonomy in Assessment.

As pointed out by Veilleux (1999), academics often concentrate on material coverage and consider an assessment complete if all main course topics are included in the exam. Coverage of material only concerns the breadth of students’ knowledge; of late, however, an alternative method is preferred to assess its depth. Assessment of knowledge depth can be organized in accordance with Bloom’s Taxonomy. Taxonomy-based exams measure the level of learners’ comprehension by including an organized set of questions, ranging from those easily solved by a learner who has grasped the basic material to cases which require a creative approach in applying various techniques. Based on the difficulty level of the questions given, students’ papers are marked according to fixed criteria rather than grade averages.
One of the most problematic tasks of question paper planning is achieving a balance among multiple question types which call for different levels of comprehension. Alternatively, teachers can compile diverse examinations involving questions of graded difficulty. If a teacher specifically creates understanding questions (short questions with a 40% grade allocation), application questions (less straightforward questions with 40-50% of the grading scheme) and a few analytical open-ended questions (another 10-20% of marks) to make sure that students who have attained each level can demonstrate their performance, then marking becomes less complicated and less disputable. For a 100-mark paper, this might translate into, say, 40 marks of understanding questions, 45 marks of application questions and 15 marks of analytical questions.

Veilleux (1999) further described how academics benefit from applying Bloom’s Taxonomy to ensure they are not missing essential items when compiling assessments:

(1) Instructors frequently find themselves perplexed by multiple standards and syllabus requirements. Bloom's Taxonomy provides a guiding model for subdividing those norms into approachable blocks that can be applied in making routine class plans and can also be aligned with the instructor’s own class objectives. Just as certain levels require certain delivery approaches, they also require specific assessment techniques.

(2) The taxonomy can be utilized as an index to verify that all levels of the domain are being assessed and to match assessment tools with the relevant lessons and techniques. Thereby Bloom’s Taxonomy also helps educators retain uniformity across assessment practices and educational materials, and reveal weak spots.

(3) Reference to the elements of the taxonomy is a supportive tool for defining objectives and monitoring how well students understand the material. Besides defining objectives, applying Bloom’s Taxonomy is also extremely useful in assessing students’ comprehension of concepts. Referring to taxonomy levels and reviewing where students stand among them allows the instructor to progress from an elementary to a more sophisticated level of comprehension.

(4) A final substantial benefit of objective-based assessment is genuinely meaningful marks allocation and thus less disputable grading criteria, which eliminates doubt among students regarding the grades given and removes the need for marking adjustments on the educator’s side. It is an indicator of a justified assessment mechanism and can also guide educators in adapting the level of directives for new modules (Veilleux, 1999).

Vidakovic et al. (2004) noted that Bloom’s Taxonomy has proven to be a helpful guiding structure for generating short answer, multiple choice, matching and essay questions which test students’ knowledge in various cognitive exercises. The emphasis lies in classifying a test item at a specific level of Bloom’s Taxonomy depending on the highest level of cognitive problem presented to the student.

2.3 Advantages of Using Information Systems in Assessment.

According to Sofield (2000), many developing countries have not fully utilized information technology as a means of socioeconomic development. Although educational institutions progressively acknowledge the significance of technology in education and examination practices, in most institutions the examination process is still handled manually. The manual procedure has many drawbacks: it consumes time and wastes resources on purchasing and storing paper records; it can cause errors, data redundancy and duplication of work when the same data is recorded by different examination board members; and it does not communicate examination results instantly and precisely. Workload complexity increases further, multiplied by the number of subjects each instructor has to assess during an academic session. By automating the assessment system, institutions can reduce human involvement, since the technology promises concise storage, rapid data retrieval, tireless and rigorous information processing, and instant communication of information to users. The recent cases of rail transport computerization and online banking are prospective examples demonstrating the advantages of using information technology. Learning management systems can likewise be efficiently utilized for assessment purposes in higher education (Sofield, 2000).

An automated assessment management system (AMS) dramatically decreases the amount of work and time instructors spend on manual tasks. Using an AMS, teachers can generate question banks for multiple subjects and upload supporting files and media in different formats. A further advantage of incorporating an AMS in an institution is fulfilling the requirements of employers and certification agencies by generating the digital evidence they use to evaluate students’ performance.

Another benefit is the quick turnaround of examination results, so that feedback for enhancement can be given instantly. An AMS suggested for college use provides a structure for assessment data generation and management, and allows the institution to define and align students’ learning outcomes, generate syllabus and assessment plans, record results, and outline and track improvement plans according to findings (Plantefaber and Wentland, 2015).

2.4 Systems comparison

2.4.1 Manual vs automated

Table 2.1. Manual vs. Automated Question Paper Generation System

Manual                                              | Automated
Prone to repetitions / duplications                 | Random and unbiased generation
Slow due to human labor                             | Speedy due to automation
Requires resources                                  | Requires only PC connectivity
Many steps in sorting questions based on difficulty | Automated question sorting based on difficulty
Questions used are not stored in one place          | All questions are stored in database

An automated system significantly lessens the instructor's effort, allowing a question paper to be generated in a few clicks based on requirements such as marks and question difficulty level. A shuffling algorithm ensures randomization in the process of selecting questions from the database, thereby preventing duplication of questions.
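
To make the shuffling idea concrete, here is a minimal sketch of duplicate-free random selection in Python. The prototype itself is written in Visual Basic, and the function name and dictionary-based question bank below are illustrative assumptions, not the paper's code.

```python
import random

def select_questions(question_bank, count):
    """Randomly pick `count` distinct questions from the bank.

    random.sample draws without replacement, so no question can
    appear twice in the same generated paper.
    """
    if count > len(question_bank):
        raise ValueError("question bank smaller than requested paper size")
    return random.sample(question_bank, count)

# Example: pick 10 MCQs from a bank of question records (dicts).
mcq_bank = [{"id": i, "text": f"Question {i}?"} for i in range(1, 51)]
section_a = select_questions(mcq_bank, 10)
```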

2.4.2 Similar Systems’ Comparisons

Table 2.2. Comparison of proposed system with similar software

Feature | FreshLogics Paper Builder | Quick Quest | School Scholar | Addmen | Proposed system
Integration of Bloom’s Taxonomy to determine question difficulty | No | No | No | No | Yes
Question bank database to store unlimited number of questions | Yes | Yes | Yes | Yes | Yes
Randomization of questions to eliminate repetitions | Yes | Yes | Yes | Yes | Yes
Print answer key | Yes | No | Yes | Yes | No
User-defined access | Yes | Yes | Yes | Yes | Yes
Scan input into text | No | Yes | No | Yes | No
Save questions with images and equations | Yes | Yes | Yes | Yes | No
LAN support | Yes | Yes | No | No | No
Spell-check | No | Yes | No | No | No
Automatic updates | No | Yes | No | No | No
Multiple languages | No | No | Yes | Yes | No
Ability to generate any type of questions (MCQ, fill in blanks, short / long answer etc.) | No | No | Yes | Yes | No
Questions grading (by marks / difficulty / importance) | No | No | Yes | Yes | Yes
Multiple output format for generated paper | No | No | Yes | No | No

Even though the proposed system lacks some useful functions found in other software, it has one unique and essential feature for question paper generation: the integration of Bloom’s Taxonomy for addressing question difficulty levels. This particular function is crucial for academics building their question papers on the taxonomy hierarchy.

3. Research Design and Methodology

For the purpose of this research, a quantitative data collection method in the form of a survey questionnaire was selected, as it is time-efficient while allowing data to be gathered from a larger number of respondents within the selected segment.

The objectives of this survey are to reveal the issues with the current manual method of designing question papers, to identify the needs and requirements of users, and to conduct further analysis of the collected data.

3.1 Questionnaire design

The questionnaire was constructed using MCQs to establish particular facts, as well as questions based on a Likert scale, where the likelihood of an answer ranges from least to most. Analysis of such answers indicates the number of users who agreed or disagreed with the statements in question; a tally sketch follows the section list below.

The questionnaire consisted of four sections:

- Section 1: greeting respondents and introducing the purpose of the study;
- Section 2: demographic profile;
- Section 3: questions concerning the current system;
- Section 4: questions concerning the proposed system.
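
As a concrete illustration of how Likert answers can be tallied into agreement percentages, here is a short Python sketch; the labels and responses are made up for illustration, not the study's actual data.

```python
from collections import Counter

# Illustrative 5-point Likert responses for one statement.
responses = ["Agree", "Strongly agree", "Neutral", "Agree", "Disagree",
             "Agree", "Strongly agree", "Agree", "Neutral", "Agree",
             "Strongly agree", "Agree"]

counts = Counter(responses)
agreed = counts["Agree"] + counts["Strongly agree"]
print(f"Agreement: {agreed / len(responses):.0%}")  # prints "Agreement: 75%"
```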

3.2 Survey administration

Google Forms was used as the questionnaire design tool; its built-in instruments allow for easy distribution, interpretation and analysis.

The intended audience for data gathering was the FTMS academics, as the primary users of the system. The questionnaire was distributed via e-mail to twenty-five staff, including lecturers and Heads of Schools, and a total of twelve responses were collected. 75% of respondents were male and 25% female. This ratio was not intended to target the male category; the sample was random, and feedback was simply received from only three of the female questionnaire recipients.

4. Results and Discussion

4.1 Results analysis.

The analysis of selected questions is presented below. Questions were selected based on the degree of relevance of their answers to the main problem.

Q1.

Figure 1. Level of busyness of the users.

Presumably due to their wide range of responsibilities, more than 90% of respondents agreed that they have a massive workload. The lecturers who answered the survey are responsible for tasks such as preparing course materials, lecturing for 15 hours per week, counselling, manually creating assessment questions, invigilating and supervising students. This result means that staff are usually heavily occupied, and automation of some of their tasks would greatly assist them in carrying out their duties efficiently.

Q2.

Figure 2. Level of satisfaction of the users with current method.

About 60% of lecturers are not satisfied with the manual method of designing examination assessment questions. As can be seen from Figure 2, more than half of the respondents are discontented with the manual method of question paper generation, as it is time-consuming and, given their workload, affects their ability to do other tasks. This also indicates the need for alternative ways to create the questions.

Q3.

Figure 3. Issues with current method.

When asked to rate the major issues they face with the current manual method of examination assessment, 100% of respondents rated ‘time-consuming’ as the major issue, followed by the probability of making a mistake (‘prone to error’) at 66.7%. The third-rated issue is the difficulty of accessing good resources, at 58.3%; last but not least, wastage of resources was rated fourth, by 50% of respondents, among the major issues lecturers have with the manual method of question paper creation.

Q4.

Figure 4. Time it takes users to generate one question paper.

Designing a question paper for one subject takes up to five hours on average, bearing in mind that each lecturer has to teach between four and six subjects. Developing examination papers for every subject is therefore time-consuming.

Q5.

Figure 5. Guide used for questions set-up.

100% of the lecturers refer to Bloom’s Taxonomy as a guide for setting up examination questions.

Q6.

Figure 6. Proneness to errors

More than 90% of respondents agree that even after review and approval of question papers, errors are still usually found in the final draft. These errors could be caused by exhaustion on the part of setters using the manual method.

Q7.

Figure 7. Likelihood of users using the proposed system

100% of respondents indicated their willingness to use an automated system, with not a single negative answer, as they perceive the proposed system to be beneficial and able to help them create papers more quickly.

Q8.

Figure 8. Usefulness of proposed system

100% of respondents find an automated system more efficient and productive than the manual one.

4.2 Discussion of Findings.

From the questionnaire data analysis the following findings were identified: (1) the majority of respondents are lecturers of SOECS with five to nine years of working experience, who have four to seven subjects under their responsibility and are advanced computer users; (2) the majority of respondents admitted having a massive workload due to the wide scope of responsibilities besides lecturing and preparing materials, such as invigilation, MQA documentation, project supervision, and managerial or marketing activities; (3) the majority of respondents are not satisfied with the current manual method, as it is slow and time-consuming, prone to human error even after exam board review, wasteful of resources and not environmentally friendly; (4) more than half of the respondents admitted that they sometimes fail to meet the question paper submission deadline, as it takes up to five hours to design a question paper for one subject using lecture slides, books and online sources while categorizing questions according to Bloom’s Taxonomy; (5) the majority of respondents indicated that they would very likely use an automated system, finding it more efficient and useful due to process automation and added features such as user security via login authentication, the ability to save questions in a database, and saving generated papers as soft copies.

4.3 Proposed System Design.

A data flow diagram is used to depict the flow of data inputs and outputs in the AQPGS through its processes, and where the data is stored. The symbols used follow the Gane and Sarson notation.

Figure 9. AQPGS DFD Level 1

This graphical representation shows the functions the AQPGS prototype supports. For the Admin: add, modify and delete users, storing their information in the database. For the Lecturer: add three types of questions and save them in the question bank database; generate a question paper and export the result. The system returns the generated paper and exports it to the user’s computer.
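
As an illustration of the two data stores shown in the DFD (user information and the question bank), the sketch below models the record types as Python dataclasses. The field names are assumptions; the prototype keeps these records in MS Access tables rather than in code.

```python
from dataclasses import dataclass

@dataclass
class User:
    username: str
    password: str
    role: str          # "admin" or "lecturer"

@dataclass
class Question:
    question_id: int   # basis for the randomized retrieval
    subject: str
    qtype: str         # "MCQ", "TrueFalse" or "OpenEnded"
    text: str
    answer: str        # correct option, True/False, or model answer
```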

5. Proposed system implementation

5.1 Interface designs and main functions.

Figure 10. Homepage interface and Login window.

Before a user can access the system, they must pass the authentication step through the login form by entering the correct username and password issued by the system administrator. If the user has the administrator role, the login form redirects to the Admin Portal; if the login details match a lecturer role, the user is redirected to the Lecturer Portal.
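
Below is a minimal sketch of this role-based redirect, assuming an in-memory user store; the prototype performs the equivalent lookup against its MS Access user table, and a production system would store hashed passwords rather than compare plaintext.

```python
def authenticate(users, username, password):
    """Return the portal to open for a valid login, or None on failure.

    `users` maps username -> (password, role).
    """
    record = users.get(username)
    if record is None or record[0] != password:
        return None  # unknown user or wrong password
    return "Admin Portal" if record[1] == "admin" else "Lecturer Portal"

users = {"alice": ("secret", "admin"), "bob": ("pass123", "lecturer")}
assert authenticate(users, "alice", "secret") == "Admin Portal"
assert authenticate(users, "bob", "wrong") is None
```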

Figure 11. Admin Portal interface.

Once a user has entered a username and password belonging to the admin role, the Admin Portal form is shown, from which the Admin can add new users, modify user information and delete users.

Figure 12. Admin Portal: Add/Modify/Delete User form.

If authentication under the lecturer role is successful, the user accesses the Lecturer Portal.

Figure 13. Lecturer Portal interface.

Here the Lecturer can perform two main tasks:

(1) “Add question”: the Lecturer keys in a question together with its answers and saves the content into the question bank database. To switch between question types, the Lecturer has to close the current question form and return to the Lecturer Portal form.

Figure 14. Lecturer Portal: Add Question forms – MCQ, True/False, Open-Ended.

(2) “Generate Paper”: the Lecturer chooses the subject and clicks “Generate”, and the created paper is displayed in the form. AQPGS locates all the questions under the respective subject and question types and then generates the specified number of questions for each section. Currently the design includes 10 multiple choice and 10 true/false questions under section A, 3 questions under section B and 2 questions under section C.

Figure 15. Generated question paper.

The AQPG system retrieves questions from the question bank using a randomization algorithm based on question ID for all three sections. In sections B and C, however, a keyword-based mapping function is added to the algorithm. Every word in a question’s content is compared against specified Bloom’s Taxonomy categorization verbs acting as query keywords. If a question contains a verb matching a specified keyword, it is placed in the respective section based on its complexity level, with section C holding the most complex questions. This process is repeated until the specified set of questions has been retrieved, and the final resultant string is sent to the application form together with its respective question. Upon clicking the “Export” button, the Lecturer can download the generated question paper with all its details (answer options and instructions) in MS Word format, or click “Cancel” to abort generation.
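
The sketch below illustrates the keyword-query mapping and randomized retrieval just described. The paper does not publish its exact keyword sets, so the Bloom’s Taxonomy verbs and function names are assumptions; the section quotas (3 for B, 2 for C) match the design above, and the prototype implements the equivalent logic in Visual Basic over MS Access.

```python
import random

# Illustrative Bloom's Taxonomy categorization verbs per section;
# the paper's actual keyword sets are not published.
BLOOM_KEYWORDS = {
    "B": {"explain", "describe", "apply", "illustrate", "compare"},
    "C": {"evaluate", "design", "justify", "critique", "propose"},
}

def map_section(question_text):
    """Assign an open-ended question to section B or C by keyword match.

    Every word is compared against the categorization verbs; the highest
    matching complexity level wins (section C over section B).
    """
    words = {w.strip(".,?!;:").lower() for w in question_text.split()}
    if words & BLOOM_KEYWORDS["C"]:
        return "C"
    if words & BLOOM_KEYWORDS["B"]:
        return "B"
    return None  # no categorization verb found; question stays unclassified

def generate_open_ended_sections(open_ended, n_b=3, n_c=2):
    """Bucket questions by mapped section, then draw each section's quota
    at random (random.sample raises ValueError if a pool is too small)."""
    pools = {"B": [], "C": []}
    for q in open_ended:
        section = map_section(q)
        if section:
            pools[section].append(q)
    return {
        "B": random.sample(pools["B"], n_b),
        "C": random.sample(pools["C"], n_c),
    }
```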

6. Conclusion and future enhancement

6.1. Conclusion.

This study presented an overview of the assessment process and Bloom’s Taxonomy, discussed the advantages of information systems in assessment, and evaluated several assessment generation systems. A survey was also conducted to collect data from FTMS academics in order to determine the issues and develop a more efficient alternative. An automated system prototype was developed as a basic desktop application in the Visual Studio environment, with functions such as secure login, a question bank and a generation algorithm, serving the main purpose of speedy design of question papers aligned with learning outcomes based on the Bloom’s Taxonomy hierarchy.

With regard to the problems identified in the current manual system, the automatic system prototype provides the following solutions: (1) it significantly reduces human involvement through process automation; (2) it helps conserve resources by performing the tireless, rigorous work of information processing, so that instructors can focus on the academic side rather than devote much time to question paper design; (3) it allows rapid data retrieval and manipulation to generate output at a click with minimum effort; (4) its robust algorithm provides unbiased results through random selection of questions and eliminates duplication; (5) it offers concise storage of question items, and the question bank can cover a wide range of subjects and question types.

The researchers hereby conclude that the proposed solution is a helpful alternative to present practice. However, to be fully functional it needs improvement, as the current prototype has the following constraints: (1) the system must be given accurate inputs, otherwise it may produce incorrect results; (2) the user has to format the question paper after it is prepared; (3) the Lecturer has to manually update information by entering new questions into the database; (4) the Admin has to change the paper structure every time a Lecturer wants to change the paper format.

6.2. Future enhancement.

The current project is a basic application prototype demonstrating the idea and proving the usefulness of an automatic system through the development of mapping and randomization algorithms that reduce human labor and eliminate errors in the question paper generation process. To further extend its functionality, the following enhancements may be made: (1) re-design the system as a web-based application using PHP, JavaScript and a SQL Server database, making it more interactive and secure and providing the opportunity to display previous question papers online for students’ reference and for online quizzes; (2) a function to scan input and convert it into text so as to save multiple questions at a time; (3) an integrated spelling and grammar check to eliminate human errors in input; (4) a check for repeated questions to avoid duplication; (5) multi-language support; (6) integrated compliance with MQA; (7) automatic paper grading based on given criteria; (8) an option for the Lecturer to change the paper format without reference to the Admin; (9) more output format options, such as MS Excel.

References

[1] Anderson, L. W., Krathwohl, D. R., Bloom, B. S. (2001). A taxonomy for learning, teaching,
and assessing: A revision of Bloom's taxonomy of educational objectives. [Online]. Available at:
http://cmapspublic2.ihmc.us/rid=1Q2PTM7HL-26LTFBX-9YN8/Krathwohl%202002.pdf
[Accessed 20 March 2018].

[2] Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education.
[Online], Vol. 32 No. 3, pp. 347-364. Available at:
https://link.springer.com/article/10.1007/BF00138871 [Accessed 15 March 2018].

[3] Biggs, J. (1999). Teaching for Quality Learning at University. Society for Research in
Higher Education and Open University Press, Buckingham. [Online]. Available at:
https://03hoezb55jq12.storage.googleapis.com/EgakKnjEN8nakamTWT12.pdf [Accessed 15
March 2018].

[4] Biggs, J. and Tang, C. (2011), Teaching for Quality Learning at University, 4th ed., Society
for Research in Higher Education and Open University Press, Berkshire. [Online]. Available at:
https://books.google.com.my/books?hl=en&lr=&id=VC1FBgAAQBAJ&oi=fnd&pg=PP1&dq=+Big
gs,+J.+and+Tang,+C.+(2007),+Teaching+for+Quality+Learning+at+University,+3rd+ed.,+Society
+for+Research+in+Higher+Education+and+Open+University+Press,+Berkshire.&ots=E7zJlGdIP
p&sig=Hkm8H0Qwdpv4CZ1myiwPzJ_L11s&redir_esc=y#v=onepage&q&f=false [Accessed 20
March 2018].

[5] Ewell, P. (2008). Building academic cultures of evidence: a perspective on learning outcomes in higher education. Symposium of the Hong Kong University Grants Committee on Quality Education, Quality Outcomes – the way forward for Hong Kong, Hong Kong. [Online]. Available at: www.ugc.edu.hk/eng/ugc/activity/outcomes/symposium/2008/present.html [Accessed 12 March 2018].

[6] Naik, K. et al. (2014). Automatic Question Paper Generation System using
Randomization Algorithm. International Journal of Engineering and Technical Research (IJETR),
[Online]. 2(12), 192-194. Available at:
http://www.academia.edu/10336965/Automatic_Question_Paper_Generation_System_using_R
andomization_Algorithm [Accessed 5 February 2018].

[7] Oliver, E. (2015). Alternative assessment for effective open distance education. University of South Africa. [Online]. Available at: https://uir.unisa.ac.za/bitstream/handle/10500/20010/dissertation_oliver_e.pdf?sequence=1&isAllowed=y [Accessed 20 March 2018].

[8] Plantefaber, L. and Wentland, E. (2013). Evaluating Assessment Management Systems: Using Evidence from Practice. Westfield State University. [Online]. Available at: https://www.iaea.info/documents/paper_5bc19b6e.pdf [Accessed 07 April 2018].

[9] Saulnier, B.M., Landry, J.P. and Wagner, T.A. (2008). From Teaching to Learning: Learner-Centered Teaching and Assessment in Information Systems Education. Journal of Information Systems Education, [Online]. 19(2), 169-175. Available at: https://jise.org/volume19/n2/JISEv19n2p169.pdf [Accessed 20 March 2018].

[10] Sharma, D. et al. (2017). Automatic Question Paper Generator. Artificial Intelligence Project Report 2017. Delhi University. [Online]. Available at: https://github.com/DEEZZU/Automatic-Question-Paper-Generator-App/blob/22832666934e523d30603b8fb3249548f206dc83/AI%20REPORT_AQPG.pdf [Accessed 25 May 2018].

[11] Sofield, T. H. B. (2000). Cultural Attitudes Towards Technology and Communication. Perth Second International Conference. [Online], 3-26. Available at: https://www.hpuniv.nic.in/Journal/Jul_2011_Mohini%20and%20A%20J%20%20Singh.pdf [Accessed 09 April 2018].

[12] Tam, M. (2014). Outcomes-based approach to quality assessment and curriculum improvement in higher education. Quality Assurance in Education, [Online]. 22(2), 158-168. Available at: https://doi.org/10.1108/QAE-09-2011-0059 [Accessed 12 March 2018].

[13] The Glossary of Education Reform (2015). Assessment. [Online] Available at:
https://www.edglossary.org/assessment/. [Accessed 03 March 2018].

[14] Umardand, A., Gaikwad A. (2017). A Survey on Automatic Question Paper Generation
System. International Advanced Research Journal in Science, Engineering and Technology
(IARJSET), [Online]. 4(4), 18-20. Available at:
http://www.iarjset.com/upload/2017/si/NCIARCSE-2017/IARJSET-NCIARCSE%206.pdf
[Accessed 11 February 2018].

[15] Veilleux, N. (1999). Assessment Tools based on Bloom’s Taxonomy of Educational Objectives. Metropolitan College of Boston University. [Online]. Available at: https://peer.asee.org/assessment-tools-based-on-bloom-s-taxonomy-of-educational-objectives.pdf [Accessed 27 March 2018].

[16] Vidakovic, D. et al. (2004). Bloom's Taxonomy in Developing Assessment Items - Discussion, Teaching Implications, and Conclusion. Convergence | Mathematical Association of America. [Online]. Available at: https://www.maa.org/press/periodicals/loci/joma/blooms-taxonomy-in-developing-assessment-items-discussion-teaching-implications-and-conclusion [Accessed 27 March 2018].

IJISE is a FTMS Publishing Journal
