CS8082-Machine Learning Techniques

This document contains a question bank for the Machine Learning Techniques course, with questions divided into three parts: Part A (2 marks), Part B (13 marks) and Part C (15 marks). The questions cover five units, from Introduction through Advanced Learning. Some questions ask students to define, explain or discuss concepts, while others involve applications such as drawing decision trees or analyzing algorithms. The document aims to test students' understanding, application, analysis, evaluation and creation abilities for the concepts taught in the course.

www.rejinpaul.com

SRM VALLIAMMAI ENGINEERING COLLEGE


SRM Nagar, Kattankulathur – 603 203

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

QUESTION BANK

VII SEMESTER
Regulation – 2017
Academic Year 2020 – 21 ODD
CS8082 MACHINE LEARNING TECHNIQUES

Prepared by

Mrs.S.Shanthi, Assistant Professor/CSE

Download updated materials from Rejinpaul Network App



SEM / YEAR: VII/IV

UNIT I - INTRODUCTION
Learning Problems – Perspectives and Issues – Concept Learning – Version Spaces and
Candidate Eliminations – Inductive bias – Decision Tree learning – Representation –
Algorithm – Heuristic Space Search.
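As a quick revision aid for the concept-learning questions below, the FIND-S procedure from this unit can be sketched in Python. The EnjoySport attribute names and training rows are the usual illustrative values; treat this as a sketch, not a reference implementation.

```python
# Illustrative FIND-S sketch for the EnjoySport concept learning task.
# '0' denotes the most specific constraint, '?' accepts any value.

def find_s(examples):
    """Return the maximally specific hypothesis consistent with
    the positive training examples."""
    n = len(examples[0][0])
    h = ['0'] * n                      # start with the most specific hypothesis
    for attrs, label in examples:
        if label != 'Yes':             # FIND-S ignores negative examples
            continue
        for i, value in enumerate(attrs):
            if h[i] == '0':
                h[i] = value           # first positive example: copy its values
            elif h[i] != value:
                h[i] = '?'             # minimally generalize on a mismatch
    return h

# (Sky, AirTemp, Humidity, Wind, Water, Forecast) -> EnjoySport
train = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'), 'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'), 'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
print(find_s(train))  # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```

The final hypothesis keeps only the attribute values shared by every positive example, which is exactly the behaviour several Part A and Part B questions in this unit ask about.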

PART-A (2 - MARKS)
Q.No    QUESTIONS    Competence    BT Level
1. What is machine learning (ML)? Remember BTL-1
2. Classify positive and negative examples for the target concept. Apply BTL-3
3. Show the summary of choices in designing the checkers learning program. Apply BTL-3
4. Point out a few applications of machine learning. Analyze BTL-4
5. Define "learn" in terms of machine learning. Remember BTL-1
6. Analyze a decision tree for an example of PlayTennis. Analyze BTL-4
7. Summarize the various steps in designing a program to learn to play checkers. Evaluate BTL-5
8. Define concept learning as a search. Remember BTL-1
9. Discuss the various issues in machine learning. Understand BTL-2
10. Describe the four modules of the final design of the checkers learning problem. Remember BTL-1
11. Explain the useful perspective on machine learning. Evaluate BTL-5
12. Define the inductive learning hypothesis. Remember BTL-1
13. List the algorithms of concept learning. Remember BTL-1
14. Generalize the concept of a biased hypothesis space. Create BTL-6


15. Explain decision tree learning. Analyze BTL-4
16. Discuss what concept learning is. Understand BTL-2
17. Develop the instances for the EnjoySport concept learning task. Create BTL-6
18. Examine how we use the more-general-than partial ordering to organize the search for a hypothesis consistent with the observed training examples. Apply BTL-3
19. Express how the three hypotheses h1, h2 and h3 from the EnjoySport example are related by the ≥g relation. Understand BTL-2
20. Define a set of instances with an example. Understand BTL-2
PART-B (13 - MARKS)
1. Describe in detail the three features of a well-defined learning problem for the following: (13) Remember BTL1
(i) a checkers learning problem
(ii) a handwriting recognition learning problem
(iii) a robot driving learning problem
2. Discuss in detail how to design a program to learn to (13) Understand BTL2
play checkers.

3. (i) Describe in detail the rule for estimating training values. (7) Remember BTL1
(ii) Describe the final design of the checkers learning system. (6)

4. Illustrate the useful perspective on machine learning. (13) Apply BTL3

5. Discuss the Issues in machine learning. (13) Understand BTL2


6. (i)Generalize the concept learning task. (7) Create BTL6
(ii)Compose the Inductive Learning Hypothesis over (6)
the training example.
7. (i) Describe concept learning as search. (7) Remember BTL1
(ii) Describe the general-to-specific ordering of hypotheses. (6)
8. (i) Illustrate with a diagram the decision tree representation for the PlayTennis concept. (7) Apply BTL3
(ii) Illustrate the problems appropriate for decision tree learning. (6)


9. (i) Explain in detail FIND-S: finding a maximally specific hypothesis. (7) Evaluate BTL5
(ii) Explain the key properties of the FIND-S algorithm. (6)
10. Explain the following: Analyze BTL4
(i) Compact representation for version spaces (7)
(ii) The LIST-THEN-ELIMINATE algorithm (6)
11. Illustrate the basic decision tree algorithm. (13) Apply BTL3
12. (i) Describe what the Candidate-Elimination learning algorithm is. (3) Understand BTL2
(ii) Discuss in detail the Candidate-Elimination algorithm with an example. (10)
13. (i) Define inductive bias. (3) Remember BTL1
(ii) Describe a biased hypothesis space. (10)
14. (i) Explain in detail an unbiased learner for the EnjoySport learning task. (7) Analyze BTL4
(ii) Explain The Futility of Bias-Free Learning. (6)
PART-C (15 - MARKS)
1. Compose what a decision tree is. Draw decision (15) Create BTL6
trees to represent the following Boolean functions:
a) A ∧ ¬B
b) A ∨ [B ∧ C]
c) A XOR B
d) [A ∧ B] ∨ [C ∧ D]
2. Compose some successful applications of machine learning. (15) Create BTL6
3. Analyze the following: (15) Evaluate BTL5
(i) Will the Candidate-Elimination algorithm converge to the correct hypothesis?
(ii) What training example should the learner request next?
4. Explain the hypothesis space search in decision tree learning. (15) Analyze BTL4

UNIT II - NEURAL NETWORKS AND GENETIC ALGORITHMS
Neural Network Representation – Problems – Perceptrons – Multilayer Networks and Back Propagation Algorithms – Advanced Topics – Genetic Algorithms – Hypothesis Space Search – Genetic Programming – Models of Evolution and Learning.
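Several questions in this unit concern the perceptron training rule. The following sketch applies that rule to the AND function, which is linearly separable; the learning rate, epoch count and dataset are illustrative assumptions.

```python
# Illustrative perceptron training-rule sketch on the AND function.

def train_perceptron(data, eta=0.1, epochs=20):
    """data: list of (inputs, target) pairs with targets in {0, 1}."""
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in data:
            o = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            # perceptron rule: w_i <- w_i + eta * (t - o) * x_i
            for i in range(n):
                w[i] += eta * (t - o) * x[i]
            b += eta * (t - o)
    return w, b

def predict(w, b, x):
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# AND is linearly separable, so the rule converges
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
print([predict(w, b, x) for x, _ in and_data])  # [0, 0, 0, 1]
```

XOR, by contrast, is not linearly separable, so this rule never converges on it; that contrast is the usual motivation for the multilayer networks and back propagation covered later in the unit.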
PART-A (2 - MARKS)
Q.No    QUESTIONS    Competence    BT Level
1. Write the biological motivation for studying ANN. Create BTL6
2. Define ANN. Remember BTL1
3. Describe neural network representation with an example. Remember BTL1
4. Define linearly separable sets of examples. Remember BTL1
5. List out the characteristics for which the back propagation algorithm is used. Remember BTL1
6. Compare and contrast gradient descent and the delta rule. Analyze BTL4
7. Define perceptron, and state which Boolean functions can be represented by a perceptron. Remember BTL1
8. Define the back propagation algorithm. Evaluate BTL5
9. Discuss what type of unit we should use as the basis for constructing a multilayer network. Understand BTL2
10. Explain why it is important that a perceptron can represent AND, OR, NAND and NOR. Analyze BTL4
11. How is a hypothesis represented in a genetic algorithm? Create BTL6
12. Describe what a genetic algorithm is. Understand BTL2
13. What are the advantages of genetic algorithms? Understand BTL2
14. Illustrate the Baldwin effect. Apply BTL3
15. Differentiate crossover and mutation. Analyze BTL4
16. Define crowding. Remember BTL1
17. Explain what genetic programming is. Evaluate BTL5
18. Illustrate Lamarckian evolution. Apply BTL3
19. Discuss what a schema is in GA. Understand BTL2
20. Show the program tree representation in genetic programming. Apply BTL3

PART-B (13 - MARKS)
1. (i) Explain what ANN is and what the advantages of ANN are. (7) Analyze BTL-4
(ii) Explain the prototypical example of ANN. (6)
2. Compose the problems for which ANN learning is well suited, and write down their characteristics. (13) Create BTL6


3. (i) Illustrate the diagram for visualizing the hypothesis space. (7) Apply BTL3
(ii) Analyze the derivation of the gradient descent rule. (6)
4. (i) Explain the derivation of the back propagation algorithm. (7) Evaluate BTL5
(ii) Explain the gradient descent algorithm for training a linear unit. (6)
5. (i) Define perceptrons. (7) Remember BTL1
(ii) Describe a perceptron with an example, and draw the decision surface represented by a two-input perceptron. (6)
6. (i) What is the perceptron training rule? (3) Remember BTL1
(ii) Describe the back propagation algorithm. (10)

7. (i) Distinguish between gradient descent and the delta rule. (5) Understand BTL2
(ii) Describe the delta training rule with an example. (8)
8. (i) Analyze how the hypotheses in GAs are represented by bit strings. (7) Analyze BTL4
(ii) Explain how if-then rules can be encoded. (6)

9. (i) Define the genetic algorithm. (8) Remember BTL1
(ii) Describe the prototypical genetic algorithm. (5)
10. (i) Illustrate the common operators for genetic algorithms. (7) Apply BTL-3
(ii) Show the diagram for the various crossover operators and explain them. (6)
11. (i) Define fitness function. (5) Understand BTL2
(ii) Examine, with an example, how a genetic algorithm searches a large space of candidate objects according to the fitness function. (8)

12. (i) Demonstrate the hypothesis space search of GAs compared with neural network back propagation. (7) Apply BTL3
(ii) Illustrate the AddAlternative and DropCondition operators. (6)
13. Discuss in detail population evolution and the schema theorem. (13) Understand BTL-2


14. (i) Examine what genetic programming is, and draw the program tree representation in genetic programming. (7) Remember BTL-1
(ii) Describe an example to explain genetic programming. (6)
PART-C (15 - MARKS)
1. What is inductive bias? Generalize hidden layer representations. (15) Create BTL6
2. Explain in detail the following: (15) Evaluate BTL5
(i) Alternative error functions
(ii) Alternative error minimization procedures
3. Analyze the models of evolution and learning in genetic algorithms. (15) Analyze BTL4
4. Explain parallelizing genetic algorithms. (15) Evaluate BTL5
UNIT III - BAYESIAN AND COMPUTATIONAL LEARNING
Bayes Theorem – Concept Learning – Maximum Likelihood – Minimum Description Length Principle – Bayes Optimal Classifier – Gibbs Algorithm – Naïve Bayes Classifier – Bayesian Belief Network – EM Algorithm – Probability Learning – Sample Complexity – Finite and Infinite Hypothesis Spaces – Mistake Bound Model.
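Many questions in this unit reduce to applying Bayes theorem. The following sketch works the standard medical-test style example often paired with this topic; the prior and likelihood numbers are illustrative assumptions.

```python
# Minimal Bayes-theorem sketch: posterior probability of a hypothesis
# given a positive test result, with the evidence computed by the
# theorem of total probability.

def posterior(prior, like_pos, like_neg):
    """P(h | +) = P(+ | h) P(h) / P(+)."""
    evidence = like_pos * prior + like_neg * (1 - prior)
    return like_pos * prior / evidence

# Assumed numbers: P(disease) = 0.008, P(+ | disease) = 0.98,
# P(+ | no disease) = 0.03
p = posterior(0.008, 0.98, 0.03)
print(round(p, 3))  # 0.209
```

Even with a highly accurate test, the posterior stays near 0.21 because the prior is so small; this is the point the "Define Bayes Theorem" and MAP-hypothesis questions usually probe.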

PART-A (2 - MARKS)
1. List the advantages of studying Bayesian learning methods. Remember BTL1
2. Define Bayes theorem. Remember BTL1
3. Define maximum likelihood. Remember BTL1
4. Define the Minimum Description Length principle. Remember BTL1
5. Define Bayes optimal classification. Remember BTL1
6. Define the Gibbs algorithm. Remember BTL1
7. Give the formulas of basic probability. Understand BTL2
8. Differentiate Bayes theorem and concept learning. Analyze BTL4
9. Explain Bayesian belief networks. Evaluate BTL5
10. Give the formula for the probability density function. Understand BTL2
11. Generalize the probably approximately correct (PAC) learning model. Create BTL6
12. Illustrate the mistake bound model of learning. Apply BTL3
13. Differentiate sample complexity for infinite and finite hypothesis spaces. Analyze BTL4
14. Compose sample complexity. Create BTL6
15. Give the advantages of the EM algorithm. Understand BTL2
16. Deduce ε-exhausting the version space. Evaluate BTL5


17. Give the Brute-Force MAP Learning algorithm. Understand BTL2
18. Explain the EM algorithm. Analyze BTL4
19. Show a set of three instances shattered by eight hypotheses. Apply BTL3
20. Discuss shattering a set of instances. Understand BTL2
PART-B (13- MARKS )
1. (i) Discuss in detail Bayes theorem with an example. (7) Understand BTL2
(ii) Discuss the features of the Bayesian learning method. (6)

2. (i) Summarize in detail the relationship between Bayes theorem and concept learning. (7) Evaluate BTL5
(ii) Write down the Brute-Force Bayes concept learning algorithm. (6)

3. Explain maximum likelihood. (13) Analyze BTL4
4. Illustrate with an example why the Gibbs algorithm is better than the Bayes optimal classifier. (13) Apply BTL3
5. (i) Discuss the Minimum Description Length principle. (7) Understand BTL2
(ii) Discuss what we shall conclude from this analysis of the Minimum Description Length principle. (6)
6. (i) Describe what the Bayes optimal classifier is. (7) Create BTL6
(ii) Describe Bayes optimal classification. (6)
7. (i) Analyze the naïve Bayes classifier. (7) Analyze BTL4
(ii) Explain the naïve Bayes classifier with an example. (6)

8. (i) Describe Bayesian belief networks. (7) Remember BTL1
(ii) Describe conditional independence. (6)
9. (i) Describe the EM algorithm. (7) Remember BTL1
(ii) Describe in detail estimating the means of k Gaussians. (6)
10. (i) Describe in detail probability learning. (7) Remember BTL1
(ii) Describe in detail the error of a hypothesis. (6)
11. Point out the three concepts of PAC learnability. (13) Analyze BTL-4
12. (i) Show sample complexity for finite hypothesis spaces. (7) Understand BTL-2
(ii) Discuss the mistake bound model of learning. (6)
13. (i) What is ε-exhausting the version space? (7) Remember BTL-1
(ii) Examine learning and inconsistent hypotheses. (6)


14. (i) Illustrate sample complexity for infinite hypothesis spaces. (7) Apply BTL-3
(ii) Demonstrate the Vapnik–Chervonenkis dimension. (6)
PART-C (15 - MARKS)
1. Create MAP Hypotheses and Consistent Learners. (15) Create BTL-6
2. (i) Explain Bayesian belief networks. (15) Evaluate BTL5
(ii) Explain how a Bayesian network is used to infer the values of a target variable.
3. (i) Generalize learning Bayesian belief networks. (15) Create BTL-6
(ii) Generalize gradient ascent training of Bayesian networks.

4. (i) Summarize the general statement of the EM algorithm. (15) Evaluate BTL5
(ii) Deduce the k-means algorithm.
UNIT IV - INSTANCE BASED LEARNING
K-Nearest Neighbour Learning – Locally Weighted Regression – Radial Basis Functions – Case Based Learning
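The k-nearest neighbour questions in this unit can be grounded with a small sketch for a discrete-valued target function. The 2-D points and labels below are toy assumptions chosen so the neighbourhoods are obvious.

```python
# Illustrative k-nearest-neighbour sketch for a discrete-valued target.
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (point, label) pairs. Returns the majority label
    among the k training points closest to the query (Euclidean)."""
    nearest = sorted(train, key=lambda pl: math.dist(pl[0], query))
    top_k = [label for _, label in nearest[:k]]
    return Counter(top_k).most_common(1)[0][0]

train = [((1, 1), 'A'), ((1, 2), 'A'), ((2, 1), 'A'),
         ((6, 6), 'B'), ((6, 7), 'B'), ((7, 6), 'B')]
print(knn_classify(train, (2, 2)))  # A
print(knn_classify(train, (6, 5)))  # B
```

Note that the method is lazy: all computation is deferred to query time, which is exactly the lazy-versus-eager contrast asked about in Part A and Part B of this unit.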
PART-A (2 -MARKS)
1. Define the formula for the distance between two instances. Remember BTL1
2. Show the radial basis function network. Apply BTL3
3. What is the k-nearest neighbour learning algorithm? Remember BTL1
4. Illustrate how instance-based learning methods differ from function approximation. Apply BTL3
5. Explain the k-nearest neighbour algorithm for approximating a discrete-valued function. Analyze BTL4
6. What is the nature of the hypothesis space H implicitly considered by the k-nearest neighbour algorithm? Remember BTL1
7. Define locally weighted regression. Remember BTL1
8. Define the distance-weighted nearest neighbour algorithm. Remember BTL1
9. Define the curse of dimensionality. Remember BTL1
10. Differentiate regression, residual and kernel function. Analyze BTL4
11. Give the advantages of instance-based methods. Understand BTL2
12. Discuss the advantages and disadvantages of locally weighted regression. Understand BTL2
13. Differentiate lazy versus eager learning. Understand BTL2
14. Compose the three properties that are shared by instance-based methods. Create BTL6
15. Summarize the three lazy learning methods. Evaluate BTL5


16. Show the Voronoi diagram for k-nearest neighbour. Apply BTL3
17. Explain radial basis functions. Evaluate BTL5
18. Compose the formula for locally weighted linear regression. Create BTL6
19. Analyze what the inductive bias of k-nearest neighbour is. Analyze BTL4
20. Distinguish between CADET and k-nearest neighbour. Understand BTL2
PART-B (13- MARKS )
1. (i) Illustrate the disadvantages of instance-based methods. (7) Apply BTL3
(ii) Examine the k-nearest neighbour learning algorithm. (6)
2. Explain in detail the distance-weighted nearest neighbour algorithm. (13) Evaluate BTL5
3. (i) Generalize locally weighted linear regression. (13) Create BTL6
(ii) Illustrate locally weighted linear regression with an example.
4. (i) Explain radial basis functions. (7) Analyze BTL4
(ii) Describe the two-stage process of RBF networks. (6)
5. (i) Discuss in detail locally weighted regression. (7) Understand BTL2
(ii) Discuss the pros and cons of locally weighted regression. (6)
6. (i) Explain what the inductive bias of the k-nearest neighbour algorithm is. (7) Analyze BTL4
(ii) Analyze the following: (6)
(a) Regression
(b) Residual
(c) Kernel function

7. Discuss the generic properties of case-based reasoning systems. (13) Understand BTL2
8. Describe the prototypical example of a case-based reasoning system. (13) Remember BTL1
9. Write in detail about lazy learning. (13) Remember BTL1
10. Examine instance-based learning methods. (13) Remember BTL1
11. (i) Explain in detail eager learning. (7) Analyze BTL4
(ii) Point out how eager learning differs from lazy learning. (6)
12. Illustrate several generic properties of case-based reasoning systems. (13) Apply BTL-3
13. (i) Describe the CADET system. (7) Understand BTL-2
(ii) Describe the CADET system with an example. (6)
14. Describe the advantages and disadvantages of lazy and eager learning. (13) Remember BTL-1
PART-C (15-MARKS)
1. Summarize case-based reasoning (CBR). (15) Evaluate BTL-5
2. Compare lazy and eager learning algorithms. (15) Analyze BTL4
3. Generalize what locally weighted regression is. (15) Create BTL-6
4. Compose the error E(xq), to emphasize the fact that the error is now defined as a function of the query point xq. (15) Create BTL-6

UNIT V - ADVANCED LEARNING
Learning Sets of Rules – Sequential Covering Algorithm – Learning Rule Set – First Order Rules – Sets of First Order Rules – Induction as Inverted Deduction – Inverting Resolution – Analytical Learning – Perfect Domain Theories – Explanation-Based Learning – FOCL Algorithm – Reinforcement Learning – Task – Q-Learning – Temporal Difference Learning
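For the Q-learning questions in this unit, the deterministic update Q(s, a) ← r + γ · max_a' Q(s', a') can be sketched on a tiny assumed environment: a three-state chain where stepping right from the middle state into the goal earns reward 100, with discount γ = 0.9. The environment, reward and sweep count are all illustrative assumptions.

```python
# Minimal deterministic Q-learning sketch on an assumed 3-state chain.
GAMMA = 0.9
N_STATES, GOAL = 3, 2
ACTIONS = (-1, +1)          # move left / move right

def step(s, a):
    """Deterministic transition; reward 100 only on entering the goal."""
    s2 = min(max(s + a, 0), N_STATES - 1)
    r = 100 if (s2 == GOAL and s != GOAL) else 0
    return s2, r

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
for _ in range(50):                     # sweep all non-goal state-action pairs
    for s in range(N_STATES - 1):       # the goal state is absorbing
        for a in ACTIONS:
            s2, r = step(s, a)
            # deterministic Q-learning update:
            # Q(s, a) <- r + gamma * max_a' Q(s', a')
            Q[(s, a)] = r + GAMMA * max(Q[(s2, a2)] for a2 in ACTIONS)

print(Q[(1, +1)], Q[(0, +1)])  # 100.0 90.0
```

The Q values decay by the factor γ per step away from the reward (100, then 90), which is the pattern the "Define Reinforcement learning" and "Summarize Q-learning" questions expect students to explain.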

PART-A (2 - MARKS)
1. What is "explanation-based learning"? Remember BTL1
2. Illustrate first-order Horn clauses. Apply BTL3
3. State the learn-one-rule. Remember BTL1
4. Illustrate what the sequential covering algorithm is. Apply BTL3
5. Illustrate what PROLOG-EBG is. Apply BTL3
6. Describe inverting resolution. Remember BTL1
7. List out the terminology of Horn clauses. Remember BTL1
8. Define a Turing-equivalent programming language. Remember BTL1
9. Define reinforcement learning. Remember BTL1
10. Point out how learned rule sets differ from genetic algorithms. Analyze BTL4
11. Give the importance of temporal difference learning. Understand BTL2
12. Discuss the sequential covering algorithm for learning a disjunctive set of rules. Understand BTL2
13. Distinguish between FOIL and the other algorithms. Understand BTL2
14. Generalize induction as inverted deduction. Create BTL6
15. Explain inductive logic programming. Evaluate BTL5
16. Analyze the Q-learning algorithm. Analyze BTL4


17. Compare inductive and analytical learning problems. Evaluate BTL5
18. Develop the propositional form if clauses C1 and C2 are given. Create BTL6
19. Point out the key points about Horn clauses. Analyze BTL4
20. Distinguish between the FOCL and FOIL algorithms. Understand BTL2
PART-B (13 - MARKS)
1. Explain in detail learning sets of rules, and how it differs from other algorithms. (13) Evaluate BTL5
2. (i) Explain the sequential covering algorithm. (7) Analyze BTL4
(ii) Explain learn-one-rule on one example. (6)
3. Discuss the learning task. (13) Understand BTL2
4. (i) Write in detail the sequential covering algorithm. (7) Remember BTL1
(ii) Write the AQ algorithm. (6)
5. Explain in detail the basic definitions of first order logic. (13) Analyze BTL4
6. (i) Illustrate the diagram for the search for rule preconditions as learn-one-rule proceeds from general to specific. (7) Apply BTL3
(ii) Discuss the implementation algorithm for learn-one-rule. (6)
7. (i) Analyze learning rule sets. (7) Analyze BTL4
(ii) Write some common evaluation functions used in learning rule sets. (6)
8. Demonstrate induction as inverted deduction. (13) Apply BTL3
9. Discuss in detail learning first-order rules. (13) Understand BTL2
10. (i) Describe learning sets of first-order rules: FOIL. (7) Remember BTL1
(ii) Describe the basic FOIL algorithm. (6)
11. (i) Describe learning with perfect domain theories: PROLOG-EBG. (7) Remember BTL1
(ii) Describe a training example for PROLOG-EBG. (6)
12. Summarize Q-learning. (13) Understand BTL2
13. (i) Generalize what reinforcement learning is. (7) Create BTL6
(ii) Compose temporal difference learning. (6)
14. Describe analytical learning. (13) Remember BTL1
PART-C (15 - MARKS)
1. Compose the following Horn clauses: (15) Create BTL6
(i) First-order Horn clauses
(ii) Basic terminology of Horn clauses

Download updated materials from Rejinpaul Network App


www.rejinpaul.com

2. Generalize the concept of inverting resolution. (15) Create BTL6
3. Summarize the FOCL algorithm. (15) Evaluate BTL5
4. Analyze the explanation-based training example of PROLOG-EBG. (15) Analyze BTL4
