COSC 4362
Artificial Intelligence
Assignment No. 2

Date: April 28th, 2025    Due Date: May 8th, 2025

Instructions (Please read carefully):


• Attempt all questions. Email me your solution before the due date.
• Show your work and write your solutions legibly. You can use MS Word or LaTeX to type your solutions for the written portion of the assignment, or use applications such as CamScanner/Adobe Scan to create a PDF file from your handwritten solutions.
• You are encouraged to discuss the assignments, but duplicate/copied solutions will lead to zero credit. Make sure that you submit YOUR OWN SOLUTION. If two assignments are found to be copies of each other, they BOTH receive 0.

Part 1: Expert Systems


Q. No. 1. Define an Expert System. Give two examples where Expert Systems are
practically used.
Q. No. 2. List and explain the components of an expert system.
Q. No. 3. Describe the role of an inference engine.
Q. No. 4. Differentiate between forward chaining and backward chaining, with a
simple example.
Q. No. 5. Design a simple rule-based expert system for diagnosing flu symptoms.
Q. No. 6. What is a knowledge base? How is it constructed?
Q. No. 7. Given rules:
• If A and B, then C.
• If C and D, then E.

Facts: A = True, B = True, D = True.

Use Forward Chaining to deduce new facts.

Q. No. 8. Rule-Based Inference: Given the following rules in an expert system:


• IF temperature > 100 THEN illness = fever
• IF cough = True AND fever = True THEN illness = flu
• IF flu = True THEN action = take_medicine

Facts: temperature = 102, cough = True

Determine: What is the inferred action?
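
For reference, the forward-chaining procedure behind Q. No. 7 and Q. No. 8 can be pictured as the short Python sketch below. The function name forward_chain and the placeholder rule set are illustrative only and use different rules than the assignment; what matters is the mechanism: fire every rule whose conditions are all known, add its conclusion as a new fact, and repeat until nothing new is derived.

# Minimal forward-chaining sketch (illustrative rules, not the assignment's).
def forward_chain(rules, facts):
    """Fire every rule whose conditions are all in the fact set,
    add its conclusion, and repeat until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)      # rule fires: conclusion becomes a new fact
                changed = True
    return facts

# Placeholder knowledge base for illustration only.
rules = [
    ({"P", "Q"}, "R"),   # IF P AND Q THEN R
    ({"R", "S"}, "T"),   # IF R AND S THEN T
]
print(forward_chain(rules, {"P", "Q", "S"}))   # derives R, then T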

Q. No. 9. Backward Chaining: Given the rules
• IF A AND B THEN C
• IF D THEN A
• IF E THEN B

Facts: D = True, E = True

Goal: Prove C is True using backward chaining.
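
Backward chaining works in the opposite direction: start from the goal and recursively try to prove the conditions of any rule that concludes it. A minimal Python sketch, again with placeholder rules rather than those of Q. No. 9:

# Minimal backward-chaining sketch (illustrative rules, assumed acyclic).
def backward_chain(goal, rules, facts):
    """A goal holds if it is a known fact, or if some rule concludes it
    and every condition of that rule can itself be proved."""
    if goal in facts:
        return True
    for conditions, conclusion in rules:
        if conclusion == goal and all(backward_chain(c, rules, facts)
                                      for c in conditions):
            return True
    return False

rules = [
    ({"P", "Q"}, "R"),   # IF P AND Q THEN R
    ({"S"}, "P"),        # IF S THEN P
    ({"T"}, "Q"),        # IF T THEN Q
]
print(backward_chain("R", rules, {"S", "T"}))   # True: R needs P (from S) and Q (from T)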

Part 2: Perceptrons and Adaline


Q. No. 10. Perceptron Activation: Given
• Inputs: 𝑥1 = 0, 𝑥2 = 1
• Weights: 𝑤1 = 0.3, 𝑤2 = −0.4
• Bias: 𝑏 = 0.2
• Activation Function: Step function, 𝑦 = 1 if 𝑧 > 0 and 𝑦 = 0 otherwise
• Threshold: 𝜃 = 0

Compute the output 𝑦.
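
The computation asked for in Q. No. 10 is a single weighted sum followed by the step activation. A small Python sketch with placeholder numbers (not the values given above):

# Perceptron forward pass with a step activation (threshold theta = 0).
def perceptron_output(x, w, b, theta=0.0):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b   # net input z = w1*x1 + w2*x2 + b
    return 1 if z > theta else 0                   # step function

# Placeholder inputs and weights, not those of Q. No. 10.
print(perceptron_output(x=[1, 1], w=[0.5, -0.2], b=0.1))   # z ≈ 0.4 > 0, so y = 1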

Q. No. 11. Perceptron Training (1 Epoch):

Training Data:

𝑥1 𝑥2 Target (𝑑)
1 0 1
0 1 1
−1 0 0

• Initial Weights: 𝑤1 = 0.2, 𝑤2 = −0.1, 𝑏 = 0.3


• Learning Rate: 𝜂 = 0.1
• Activation Function: Step function (with threshold 𝜃 = 0)

Task: Update weights after 1 epoch (process all samples once).
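
One epoch of the perceptron learning rule applies 𝑤 ← 𝑤 + 𝜂(𝑑 − 𝑦)𝑥 and 𝑏 ← 𝑏 + 𝜂(𝑑 − 𝑦) after each sample. A Python sketch of the loop, using placeholder data and weights rather than those of Q. No. 11:

# One epoch of perceptron training (placeholder data, not the assignment's).
def train_perceptron_epoch(data, w, b, eta):
    for x, d in data:                                # one pass over all samples
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        y = 1 if z > 0 else 0                        # step activation, theta = 0
        err = d - y
        w = [wi + eta * err * xi for wi, xi in zip(w, x)]   # w <- w + eta*(d - y)*x
        b = b + eta * err                                   # b <- b + eta*(d - y)
    return w, b

data = [([1, 0], 1), ([0, 1], 0)]                    # (inputs, target) pairs
print(train_perceptron_epoch(data, w=[0.0, 0.0], b=0.0, eta=0.1))   # ([0.1, -0.1], 0.0)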

Q. No. 12. Perceptron Training (Convergence Check):

Training Data:

𝑥1 𝑥2 Target (𝑑)
0 0 0
0 1 1
1 0 1

• Initial Weights: 𝑤1 = 0, 𝑤2 = 0, 𝑏 = 0
• Learning Rate: 𝜂 = 0.5
• Activation Function: Step function (with threshold 𝜃 = 0)

Task: Train for 2 epochs and check whether the model converges.

Q. No. 13. Adaline Training (1 epoch):

Training Data:

𝑥1 𝑥2 Target (𝑑)
1 1 1
−1 −1 0

• Initial Weights: 𝑤1 = 0.1, 𝑤2 = 0.1, 𝑏 = 0.1


• Learning Rate: 𝜂 = 0.1
• Activation Function: Step function (with threshold 𝜃 = 0)

Task: Train for 1 epoch using the LMS rule and find new weights.
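
The Adaline (LMS / Widrow–Hoff) update differs from the perceptron in that the error is computed from the raw linear output 𝑧, not from the thresholded output. A Python sketch of one epoch, with placeholder numbers that are not those of Q. No. 13 or Q. No. 14:

# One epoch of Adaline training with the LMS rule (placeholder numbers).
def train_adaline_epoch(data, w, b, eta):
    for x, d in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b       # linear (pre-threshold) output
        err = d - z                                        # LMS error uses z, not step(z)
        w = [wi + eta * err * xi for wi, xi in zip(w, x)]  # w <- w + eta*(d - z)*x
        b = b + eta * err
    return w, b

data = [([1, -1], 1), ([0, 1], 0)]
print(train_adaline_epoch(data, w=[0.0, 0.0], b=0.0, eta=0.1))

The step function is still used to produce the final classification, but it plays no role in the weight update itself.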

Q. No. 14. Adaline Training (2 epochs):

Training Data:

𝑥1 𝑥2 Target (𝑑)
1 0 1
0 1 1

• Initial Weights: 𝑤1 = 0.4, 𝑤2 = −0.2, 𝑏 = 0.1


• Learning Rate: 𝜂 = 0.05
• Activation Function: Step function (with threshold 𝜃 = 0)

Task: Train for 2 epochs using the LMS rule and find the new weights.

Q. No. 15. What is meant by 'linear separability'? What happens if the data is not linearly separable?
Q. No. 16. What are the main differences between Perceptron learning and Adaline learning?
Q. No. 17. Explain why Adaline can converge even when Perceptron fails.
