CS3491 - ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
QUESTION BANK
R-2021
Prepared by Mrs. M. Ambika, AP/CSE
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
II YEAR / IV SEMESTER
REGULATION 2021
INSTITUTION VISION & MISSION

VISION:
To provide quality education in Computer Science and Engineering and to mould the students into self-confident and professionally competent individuals.

MISSION:
M1: To produce successful graduates enriched with professional and leadership capabilities.
M2: To impart the skills necessary to continue education and grow professionally.
M5: To contribute towards empowering the rural youth with computer education.
PROGRAM EDUCATIONAL OBJECTIVES (PEOs)
PEO1: Apply their technical competence in computer science to solve real world problems, with technical and people leadership.
PEO2: Conduct cutting edge research and develop solutions on problems of social relevance.
PEO3: Work in a business environment, exhibiting team skills, work ethics, adaptability and lifelong learning.

PROGRAM SPECIFIC OUTCOMES (PSOs)
PSO1: Exhibit design and programming skills to build and automate business solutions using cutting edge technologies.
PSO2: Strong theoretical foundation leading to excellence and excitement towards research, to provide elegant solutions to complex problems.
PSO3: Ability to work effectively with various engineering fields as a team to design, build and develop system applications.
CS3491 ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
L T P C : 3 0 2 4
COURSE OBJECTIVES:
The main objectives of this course are to:
Study about uninformed and heuristic search techniques.
Learn techniques for reasoning under uncertainty.
Introduce machine learning and supervised learning algorithms.
Study about ensembling and unsupervised learning algorithms.
Learn the basics of deep learning using neural networks.
UNIT I PROBLEM SOLVING 9
Introduction to AI - AI Applications - Problem solving agents – search algorithms – uninformed
search strategies – Heuristic search strategies – Local search and optimization problems –
adversarial search – constraint satisfaction problems (CSP)
UNIT II PROBABILISTIC REASONING 9
Acting under uncertainty – Bayesian inference – naïve Bayes models. Probabilistic reasoning –
Bayesian networks – exact inference in BN – approximate inference in BN – causal networks.
UNIT III SUPERVISED LEARNING 9
Introduction to machine learning – Linear Regression Models: Least squares, single & multiple
variables, Bayesian linear regression, gradient descent, Linear Classification Models:
Discriminant function – Probabilistic discriminative model - Logistic regression, Probabilistic
generative model – Naive Bayes, Maximum margin classifier – Support vector machine,
Decision Tree, Random forests
UNIT IV ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9
Combining multiple learners: Model combination schemes, Voting, Ensemble Learning -
bagging, boosting, stacking, Unsupervised learning: K-means, Instance Based Learning: KNN,
Gaussian mixture models and Expectation maximization
UNIT V NEURAL NETWORKS 9
Perceptron - Multilayer perceptron, activation functions, network training – gradient descent
optimization – stochastic gradient descent, error backpropagation, from shallow networks to
deep networks –Unit saturation (aka the vanishing gradient problem) – ReLU, hyperparameter
tuning, batch normalization, regularization, dropout
TOTAL: 45 PERIODS
COURSE OUTCOMES:
At the end of this course, the students will be able to attain the course outcomes (CO1-CO5); their mapping to the programme outcomes (PO1-PO12) and programme specific outcomes (PSO1-PSO3) is given below.

CO-PO/PSO Mapping
CO  | PO1 | PO2 | PO3 | PO4 | PO5 | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 | PO12 | PSO1 | PSO2 | PSO3
CO2 |  3  |  2  |  2  |  3  |  1  |  3  |  2  |  -  |  -  |  -   |  -   |  1   |  3   |  3   |  3
CO3 |  1  |  2  |  1  |  3  |  2  |  3  |  2  |  -  |  -  |  -   |  -   |  1   |  3   |  3   |  3
CO4 |  1  |  2  |  3  |  1  |  3  |  3  |  2  |  -  |  -  |  -   |  -   |  1   |  3   |  3   |  3
CO5 |  2  |  2  |  2  |  -  |  3  |  3  |  2  |  -  |  -  |  -   |  -   |  1   |  3   |  3   |  3
Avg |  2  |  2  |  2  |  2  |  2  |  3  |  2  |  -  |  -  |  -   |  -   |  1   |  3   |  3   |  3
Text Books
1. Stuart Russell and Peter Norvig, “Artificial Intelligence – A Modern Approach”, Fourth Edition, Pearson Education, 2021.
2. Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Fourth Edition, 2020.
Reference Books
1. Dan W. Patterson, “Introduction to Artificial Intelligence and Expert Systems”, Pearson Education, 2007.
2. Kevin Knight, Elaine Rich, and B. Nair, “Artificial Intelligence”, McGraw Hill, 2008.
3. Patrick H. Winston, “Artificial Intelligence”, Third Edition, Pearson Education, 2006.
4. Deepak Khemani, “Artificial Intelligence”, Tata McGraw Hill Education, 2013 (http://nptel.ac.in/).
5. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
6. Tom Mitchell, “Machine Learning”, McGraw Hill, 3rd Edition, 1997.
7. Charu C. Aggarwal, “Data Classification: Algorithms and Applications”, CRC Press, 2014.
8. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”, MIT Press, 2012.
9. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
UNIT I
PROBLEM SOLVING
PART A

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Define artificial intelligence. | CO1 | Remember | Low
2 | What is adversarial search? | CO1 | Understand | Low
3 | List the characteristics of AI. | CO1 | Remember | Low
4 | What are agents in AI, and what do they do? | CO1 | Understand | Low
5 | How will you measure the performance of an AI application? | CO1 | Understand | Low
6 | Differentiate between uninformed and informed search algorithms. | CO1 | Understand | Low
7 | List the steps involved in a simple problem-solving agent. | CO1 | Remember | Low
8 | Give examples for real-world and toy problems. | CO1 | Remember | Low
9 | State the basis on which search algorithms are chosen. | CO1 | Remember | Low
10 | Evaluate the performance of a problem-solving method based on the depth-first search algorithm. | CO1 | Evaluate | High
11 | Why does one go for heuristic search? | CO1 | Understand | Low
12 | When is a heuristic function h said to be admissible? Give an admissible heuristic function for TSP. (A short note follows this table.) | CO1 | Understand | Low
13 | How can we avoid ridges and plateaus in hill climbing? | CO1 | Understand | Low
14 | How can minimax be extended for games of chance? | CO1 | Understand | Low
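
A short note related to Q.No 12 above: a heuristic h is admissible when it never overestimates the true cost of reaching the goal, i.e. (in LaTeX notation)

0 \le h(n) \le h^{*}(n) \quad \text{for every node } n,

where h^{*}(n) is the optimal cost from node n to a goal. For the travelling salesman problem, the cost of a minimum spanning tree over the cities still to be visited is a commonly quoted admissible heuristic, since any tour that completes the route must connect those cities and therefore costs at least as much as the MST.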
PART B

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | (i) Explain any two informed search strategies. (ii) Explain heuristic functions with examples. | CO1 | Understand | Medium
2 | Explain the following types of hill climbing search techniques: (i) simple hill climbing, (ii) steepest-ascent hill climbing, (iii) simulated annealing. | CO1 | Understand | Medium
3 | Discuss the AO* algorithm in detail with an example. | CO1 | Understand | Medium
4 | Describe the Minimax algorithm in detail. | CO1 | Understand | Medium
5 | Solve the cryptarithmetic problem SEND + MORE = MONEY. No two letters have the same value, and the assignment of digits must make the sum work out as shown. | CO1 | Create | High
6 | Evaluate Alpha-Beta pruning and the Alpha-Beta algorithm. (A minimal search sketch follows this table.) | CO1 | Evaluate | High
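
A minimal sketch related to Q.No 4 and Q.No 6 above: minimax with alpha-beta pruning, written in Python over a game tree given as nested lists (a leaf is a utility for MAX, an internal node is a list of subtrees, and levels alternate between MAX and MIN). The example tree at the end is purely illustrative and is not taken from any question in this bank.

# Minimax with alpha-beta pruning on a game tree given as nested lists.
def alphabeta(node, maximizing=True, alpha=float("-inf"), beta=float("inf")):
    if not isinstance(node, list):              # leaf: return its utility for MAX
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:                   # beta cut-off: MIN will avoid this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if beta <= alpha:                   # alpha cut-off: MAX already has a better option
                break
        return value

# Illustrative 2-ply tree (MAX to move, then MIN, then leaves).
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(alphabeta(tree))                          # prints 3, the minimax value of this tree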
UNIT II
PROBABILISTIC REASONING
PART A

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Define uncertainty. | CO2 | Remember | Low
2 | State Bayes’ rule. | CO2 | Remember | Low
3 | Differentiate logical and probabilistic assertions. | CO2 | Understand | Low
4 | Why is a hybrid Bayesian network called so? | CO2 | Understand | Low
5 | Mention the needs of probabilistic reasoning in AI. | CO2 | Remember | Low
6 | Given that P(A) = 0.3, P(A|B) = 0.4 and P(B) = 0.5, compute P(B|A). (A worked sketch follows this table.) | CO2 | Create | High
7 | Define the principle of maximum expected utility (MEU). | CO2 | Remember | Low
8 | What does the full joint probability distribution specify? | CO2 | Understand | Low
9 | What is the difference between stochastic gradient descent (SGD) and gradient descent (GD)? | CO2 | Understand | Low
10 | Why does uncertainty arise? | CO2 | Understand | Low
11 | What is the need for utility theory in uncertainty? | CO2 | Understand | Low
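
A worked sketch for Q.No 6 above, applying Bayes' rule directly to the given values (LaTeX notation):

P(B \mid A) = \frac{P(A \mid B)\, P(B)}{P(A)} = \frac{0.4 \times 0.5}{0.3} = \frac{0.2}{0.3} \approx 0.667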
PART B

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | (i) Elaborate on unconditional probability and conditional probability with an example. (ii) What is a Bayesian network? Explain the steps followed to construct a Bayesian network with an example. | CO2 | Understand | Medium
2 | What do you mean by inference in Bayesian networks? Outline inference by enumeration with an example. | CO2 | Understand | Medium
3 | Demonstrate the use of Bayes’ rule with an example of a doctor finding the probability P(disease|symptoms) before and after the disease becomes an epidemic. | CO2 | Create | High
4 | Briefly explain how the efficiency of the enumeration algorithm can be improved. | CO2 | Understand | Medium
5 | Consider the following set of propositions: patient has spots; patient has measles; patient has high fever; patient has Rocky Mountain spotted fever; patient has previously been inoculated against measles; patient was recently bitten by a tick; patient has an allergy. (i) Create a network that defines the causal connections among these nodes. (ii) Make it a Bayesian network by constructing the necessary conditional probability matrix. | CO2 | Create | High
6 | Construct a Bayesian network and define the necessary CPTs for the given scenario. We have a bag of three biased coins a, b and c with probabilities of coming up heads of 20%, 60% and 80%, respectively. One coin is drawn randomly from the bag (with equal likelihood of drawing each of the three coins) and then the coin is flipped three times to generate the outcomes X1, X2 and X3. (i) Draw a Bayesian network corresponding to this setup and define the relevant CPTs. (ii) Calculate which coin is most likely to have been drawn if the flips come up HHT. (A worked sketch follows this table.) | CO2 | Create | High
7 | Define uncertain knowledge, prior probability and conditional probability. State Bayes’ theorem. How is it useful for decision making under uncertainty? Explain belief networks briefly. | CO2 | Understand | Medium
8 | How is exact inference obtained from a Bayesian network? | CO2 | Understand | Medium
9 | How is approximate inference obtained from a Bayesian network? | CO2 | Understand | Medium
10 | Explain the Naïve Bayes classifier with an example. | CO2 | Understand | Medium
11 | Explain the variable elimination algorithm for answering queries on Bayesian networks. | CO2 | Understand | Medium
12 | Harry installed a new burglar alarm at his home to detect burglary. The alarm reliably responds to a burglary but also responds to minor earthquakes. Harry has two neighbours, David and Sophia, who have taken the responsibility to inform Harry at work when they hear the alarm. David always calls Harry when he hears the alarm, but sometimes he gets confused with the phone ringing and calls at that time too. On the other hand, Sophia likes to listen to loud music, so sometimes she misses hearing the alarm. Here we would like to compute the probability of the burglary alarm. Calculate the probability that the alarm has sounded, but neither a burglary nor an earthquake has occurred, and both David and Sophia have called Harry. | CO2 | Create | High
13 | The following table gives a data set about stolen vehicles. Using the Naïve Bayes classifier, classify the new data {Color: Red, Type: SUV, Origin: Domestic}. | CO2 | Create | High
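
A worked sketch for Q.No 6 above: the posterior over the three coins after observing the flips H, H, T, assuming a uniform prior of 1/3 on each coin and flips that are conditionally independent given the coin.

# Posterior over which coin was drawn, given the observed flips H, H, T.
heads_prob = {"a": 0.2, "b": 0.6, "c": 0.8}        # P(heads) for each coin
flips = ["H", "H", "T"]

unnormalised = {}
for coin, p in heads_prob.items():
    likelihood = 1.0
    for f in flips:
        likelihood *= p if f == "H" else (1.0 - p)
    unnormalised[coin] = (1.0 / 3.0) * likelihood  # prior * likelihood

total = sum(unnormalised.values())
posterior = {coin: v / total for coin, v in unnormalised.items()}
print(posterior)

The likelihoods of HHT are 0.032, 0.144 and 0.128 for coins a, b and c respectively, so coin b is the most likely to have been drawn.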
UNIT III
SUPERVISED LEARNING
PART A

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Outline the difference between supervised learning and unsupervised learning. | CO3 | Understand | Low
2 | What is a random forest? | CO3 | Understand | Low
3 | What is the niche of machine learning? | CO3 | Understand | Low
4 | State the logic behind Gaussian processes. | CO3 | Remember | Low
5 | How can overfitting be avoided? | CO3 | Understand | Low
6 | Assume a disease so rare that it is seen in only one person out of every million. Also assume that we have a test that is effective in that if a person has the disease, there is a 99 percent chance that the test result will be positive; however, the test is not perfect, and there is a one in a thousand chance that the test result will be positive on a healthy person. Assume that a new patient arrives and the test result is positive. What is the probability that the patient has the disease? (A worked sketch follows this table.) | CO3 | Create | High
7 | What are a training set and a test set? | CO3 | Understand | Low
8 | Compare and contrast supervised learning and unsupervised learning. | CO3 | Evaluate | High
9 | Is linear discriminant analysis classification or regression? | CO3 | Understand | Low
10 | What is pruning in decision trees, and how is it done? | CO3 | Understand | Low
11 | You have built a random forest model with 10000 trees. You were delighted to get a training error of 0.00, but the validation error is 34.23. What is going on? Haven’t you trained your model perfectly? | CO3 | Create | High
PART B

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Elaborate on logistic regression with an example. Explain the process of computing the coefficients. | CO3 | Understand | Medium
2 | What is a classification tree? Explain the steps to construct a classification tree. List and explain the different procedures used. | CO3 | Understand | Medium
3 | Describe the general procedure of the random forest algorithm. | CO3 | Understand | Medium
4 | With a suitable example, explain knowledge extraction in detail. | CO3 | Understand | Medium
5 | State when and why you would use random forests vs SVM. | CO3 | Understand | Medium
6 | Explain the principles of the gradient descent algorithm. Accompany your explanation with a diagram. (A minimal sketch follows this table.) | CO3 | Understand | Medium
7 | You have to examine the relationship between the age and price of used cars sold in the last year by a car dealership company. Here is the table of the data: | CO3 | Create | High
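
A minimal sketch for Q.No 6 (and usable on data of the kind described in Q.No 7): batch gradient descent fitting a straight line y = w*x + b by minimising mean squared error. The (age, price) pairs below are invented purely for illustration; they are not the dealership data the question refers to.

# Batch gradient descent for simple linear regression (illustrative sketch).
data = [(1, 9.0), (2, 8.1), (3, 7.3), (5, 5.8), (8, 3.9)]   # invented (age, price) pairs

w, b = 0.0, 0.0      # initial parameters
lr = 0.01            # learning rate
n = len(data)

for epoch in range(5000):
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y        # prediction error
        grad_w += 2 * err * x / n    # d(MSE)/dw
        grad_b += 2 * err / n        # d(MSE)/db
    w -= lr * grad_w                 # step against the gradient
    b -= lr * grad_b

print(w, b)   # the learned slope is negative: price falls as age increases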
UNIT IV
ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING
PART A

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Define ensemble learning. | CO4 | Remember | Low
2 | What is the significance of the Gaussian mixture model? | CO4 | Understand | Low
3 | When does an algorithm become unstable? | CO4 | Understand | Low
4 | Why does the smoothing parameter h need to be optimal? | CO4 | Understand | Low
5 | Write the three types of ensemble learning. | CO4 | Understand | Low
6 | How is expectation maximization used in Gaussian mixture models? (The update equations follow this table.) | CO4 | Understand | Low
7 | Compare K-means and Gaussian mixtures. | CO4 | Evaluate | High
8 | What are Gaussian mixture models? How is expectation maximization used in them? | CO4 | Understand | Low
9 | State the principle of maximum likelihood. | CO4 | Remember | Low
10 | How do you implement the expectation maximization algorithm? | CO4 | Understand | Low
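
A brief reference for Q.No 6, 8 and 10 above: the two alternating steps of expectation maximization for a K-component Gaussian mixture, in standard textbook notation, where \gamma_{ik} is the responsibility of component k for point x_i, N is the number of points and (\pi_k, \mu_k, \Sigma_k) are the mixing weight, mean and covariance of component k.

E-step:  \gamma_{ik} = \frac{\pi_k\, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j\, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}

M-step:  N_k = \sum_i \gamma_{ik}, \qquad \mu_k = \frac{1}{N_k} \sum_i \gamma_{ik}\, x_i, \qquad \Sigma_k = \frac{1}{N_k} \sum_i \gamma_{ik}\, (x_i - \mu_k)(x_i - \mu_k)^{\top}, \qquad \pi_k = \frac{N_k}{N}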
PART B

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | (i) What are bagging and boosting? Give examples. (ii) Outline the steps in the AdaBoost algorithm with an example. | CO4 | Understand | Medium
2 | Elaborate on the steps in the expectation-maximization algorithm. | CO4 | Understand | Medium
3 | Assume an image of pixel size 240 x 180. Elaborate on how K-means clustering can be used to achieve lossy data compression of that image. | CO4 | Apply | Medium
4 | Explain in detail how to combine multiple classifiers by voting. | CO4 | Understand | Medium
5 | Discuss various learning techniques involved in unsupervised learning. | CO4 | Understand | Medium
6 | List the applications of clustering and identify the advantages and disadvantages of clustering algorithms. | CO4 | Understand | Medium
7 | What is a Gaussian process? Explain Gaussian parameter estimates in detail with suitable examples. | CO4 | Understand | Medium
8 | Explain the KNN algorithm in detail. | CO4 | Understand | Medium
9 | Describe briefly the structure of unsupervised learning. | CO4 | Understand | Medium
10 | Consider five points {x1, x2, x3, x4, x5} with the following coordinates as a two-dimensional sample for clustering: x1 = (0.5, 1.75), x2 = (1, 2), x3 = (1.75, 0.25), x4 = (4, 1), x5 = (6, 3). Illustrate the k-means algorithm on this data set. The required number of clusters is two and, initially, the clusters are formed from a random distribution of samples: C1 = {x1, x2, x4} and C2 = {x3, x5}. (A runnable sketch follows this table.) | CO4 | Analyze | Medium
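
A runnable sketch for Q.No 10 above: the k-means loop started from the given initial partition C1 = {x1, x2, x4} and C2 = {x3, x5}, alternating centroid computation and nearest-centroid reassignment until the assignments stop changing.

# K-means on the five points of Q.No 10, with the stated initial partition.
points = {"x1": (0.5, 1.75), "x2": (1, 2), "x3": (1.75, 0.25),
          "x4": (4, 1), "x5": (6, 3)}
clusters = [["x1", "x2", "x4"], ["x3", "x5"]]

def centroid(names):
    xs = [points[n][0] for n in names]
    ys = [points[n][1] for n in names]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def sq_dist(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

while True:
    centres = [centroid(c) for c in clusters]
    new_clusters = [[], []]
    for name, p in points.items():
        k = min(range(2), key=lambda i: sq_dist(p, centres[i]))   # nearest centroid
        new_clusters[k].append(name)
    if new_clusters == clusters:        # assignments unchanged, so k-means has converged
        break
    clusters = new_clusters

print(clusters, centres)

On this data the assignments converge to C1 = {x1, x2, x3} and C2 = {x4, x5}.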
UNIT V
NEURAL NETWORKS
PART A

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | State the architecture of a multilayer perceptron. | CO5 | Remember | Low
2 | Name any two activation functions. | CO5 | Remember | Low
3 | Differentiate between a computer and the human brain. | CO5 | Understand | Low
4 | Show the perceptron that calculates the parity of its three inputs. | CO5 | Understand | Low
5 | What is stochastic gradient descent and why is it used in the training of neural networks? | CO5 | Understand | Low
6 | Why is ReLU better than softmax? Give the equation for both. (The equations are given after this table.) | CO5 | Understand | Low
7 | Define the perceptron and its types. | CO5 | Remember | Low
8 | What are the activation functions of an MLP? | CO5 | Understand | Low
9 | Why do you need an activation function? | CO5 | Understand | Low
10 | How do you solve the vanishing gradient problem within a deep neural network? | CO5 | Understand | Low
11 | Does stochastic gradient descent lead to faster training? | CO5 | Understand | Low
12 | What are the limitations of the perceptron? | CO5 | Understand | Low
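
For reference against Q.No 6 above, the standard definitions of the two functions (note that ReLU is used as a hidden-unit activation, while softmax is normally applied at the output layer to produce a probability distribution over K classes):

\mathrm{ReLU}(z) = \max(0, z), \qquad \mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad i = 1, \dots, K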
PART B

Q.No | Questions | CO Mapping | BT Level | Complexity
1 | Explain the steps in the backpropagation learning algorithm. What is its importance in designing neural networks? (A minimal sketch follows this table.) | CO5 | Understand | Medium
2 | Explain a deep feed-forward network with a neat sketch. | CO5 | Understand | Medium
3 | Elaborate on the process of training hidden layers with ReLU in deep networks. | CO5 | Understand | Medium
4 | Briefly explain hints and the different ways they can be used. | CO5 | Understand | Medium
5 | Draw the architecture of a single layer perceptron (SLP) and explain its operation. Mention its advantages and disadvantages. | CO5 | Understand | Medium
6 | How do you tune hyperparameters for better neural network performance? Explain in detail. | CO5 | Understand | Medium
7 | Compare and contrast a shallow network and a deep learning network. | CO5 | Evaluate | High
8 | List the factors that affect the performance of a multilayer feed-forward neural network. | CO5 | Understand | Medium
9 | Discuss the architecture of a multilayer perceptron (MLP) and explain its operation. Mention its advantages and disadvantages. | CO5 | Understand | Medium
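
A minimal sketch for Q.No 1 above: error backpropagation in a tiny 2-4-1 sigmoid network trained on XOR by batch gradient descent, using NumPy only. The layer sizes, learning rate and epoch count are arbitrary illustrative choices; with a typical random initialisation the network learns the XOR mapping, though convergence is not guaranteed for every seed.

# Backpropagation sketch: a 2-4-1 sigmoid network learning XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)           # hidden activations
    out = sigmoid(h @ W2 + b2)         # network output
    # backward pass: gradients of the mean squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent updates
    W2 -= lr * h.T @ d_out / len(X);  b2 -= lr * d_out.mean(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h / len(X);    b1 -= lr * d_h.mean(axis=0, keepdims=True)

print(out.round(3))   # should approach [[0], [1], [1], [0]] as training succeeds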
THANK YOU
ALL THE BEST