University of Ngaoundéré Academic Year 2021/2022
Faculty of Sciences Master 1 / Semester 2
Department of Mathematics and Computer Science
TDR Networks of Automata (INF462)
Exercise 1: Define or explain the following concepts:
Artificial neural network
Supervised machine learning
Unsupervised machine learning
Reinforcement learning
Briefly explain the gradient-descent optimization algorithm.
Summarize the backpropagation algorithm for training neural networks.
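Not part of the original questions: a minimal Python sketch of the gradient-descent update w ← w − η·dE/dw that the last two questions refer to. The quadratic objective f(w) = (w − 3)², the learning rate 0.1 and the number of steps are illustrative assumptions.

# Minimal gradient-descent sketch: repeatedly step opposite to the gradient.
def f(w):
    return (w - 3.0) ** 2          # illustrative objective, minimized at w = 3

def grad_f(w):
    return 2.0 * (w - 3.0)         # analytic derivative of f

w = 0.0                            # made-up initial guess
learning_rate = 0.1                # step size eta
for step in range(100):
    w = w - learning_rate * grad_f(w)   # w <- w - eta * df/dw

print(w)                           # approaches the minimizer w = 3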
Exercise 2: Simple perceptron
We consider a simple perceptron with two inputs and one output, with the following threshold activation function: a(x) = 1 if w · x > 0, and 0 otherwise. (A small sketch for testing candidate weights is given after the questions below.)
1) Find weights such that the perceptron computes the logical AND function.
2) Same question for the logical OR function.
3) Try to find weights for the XOR function.
4) Build a neural network that computes the XOR function.
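Not part of the original questions: a small Python helper sketch for checking candidate weights against a Boolean truth table with the threshold activation above. The explicit bias b is an assumption (equivalently, a weight on a constant input x0 = 1); the example weights at the end are deliberately not a solution.

# Threshold unit a = 1 if w1*x1 + w2*x2 + b > 0, else 0 (bias b is an assumed extra parameter).
def perceptron(w1, w2, b, x1, x2):
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

def realizes(w1, w2, b, target):
    # target maps each input pair (x1, x2) to the expected output
    return all(perceptron(w1, w2, b, x1, x2) == y for (x1, x2), y in target.items())

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Plug in your own candidate weights; this made-up choice does not realize AND.
print(realizes(1.0, 1.0, 0.0, AND))   # False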
Exercise 3: Weight updates for multilayer networks
We saw in class how to update the weights in the case of a single neuron using gradient descent. The objective of this exercise is to address the case of a two-layer network: a layer of neurons connected to the inputs and a layer of output neurons.
Notations:
x_ji denotes input i of unit j
w_ji denotes the weight associated with input i of unit j
net_j = Σ_i w_ji x_ji is the weighted sum of the inputs of unit j
o_j is the output value of unit j
t_j is the target value of unit j
σ is the sigmoid function
outputs denotes the set of neurons of the output layer
downstream(j) is the set of neurons that take the output of unit j as an input
We measure the error on a training instance d with the function E_d(w) = ½ Σ_{k ∈ outputs} (t_k − o_k)². (A small numerical sketch of these notations is given after the questions below.)
1) The sigmoid function is σ(x) = 1 / (1 + e^(−x)). What is its derivative?
2) Using gradient descent, write the update formula
a) for the output units.
b) for the other units.
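Not part of the exercise: a small numerical Python sketch of the notation above, computing net_j, o_j = σ(net_j) and E_d(w) for one instance. The tiny architecture (two inputs, two hidden units, one output unit), the weights, the inputs and the target are made-up values.

import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

x = [1.0, 0.5]                                 # the inputs of instance d

# Hidden layer: unit j has weight w_hidden[j][i] on input x[i].
w_hidden = [[0.2, -0.4], [0.7, 0.1]]
o_hidden = [sigmoid(sum(w_ji * x_i for w_ji, x_i in zip(w_j, x)))   # o_j = sigma(net_j)
            for w_j in w_hidden]

# Output layer: a single unit whose inputs are the hidden outputs.
w_out = [0.5, -0.3]
o_k = sigmoid(sum(w_ki * o_j for w_ki, o_j in zip(w_out, o_hidden)))

t_k = 1.0                                      # target value for the single output unit
E_d = 0.5 * (t_k - o_k) ** 2                   # E_d(w) = 1/2 * sum_k (t_k - o_k)^2
print(E_d)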
Exercise 4: Train a perceptron to express the conjunction x ∧ y.
Exercise 5: Let x and y be two Boolean variables.
1) Design a two-input neural network implementing the Boolean function x ∧ ¬y.
2) Design a two-layer neural network implementing the Boolean function x XOR y.
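Not part of the exercise: one well-known two-layer construction of XOR from threshold units, as a Python sketch (not necessarily the construction expected here); the first layer computes OR(x, y) and NAND(x, y), and the second layer ANDs them.

def step(net):
    return 1 if net > 0 else 0

def xor_net(x, y):
    h1 = step(x + y - 0.5)        # OR(x, y)
    h2 = step(-x - y + 1.5)       # NAND(x, y), i.e. NOT(x AND y)
    return step(h1 + h2 - 1.5)    # AND(h1, h2)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor_net(x, y))   # prints the XOR truth table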
Exercise 6: Consider the multilayer neural network described by the following graph:
1) Give the mathematical formulas that determine the intermediate outputs f11, f12, h11, h12, f21, as well as the final output ŷ.
2) Let the error function be E(w) = (y − ŷ)². By applying the backpropagation algorithm, find the expressions for the parameter updates Δw_j for j = 1, …, 7.
Exercise 7: Consider the multilayer neural network described by the following graph:
Let the data point be (x, y) = (2, 1).
1) Calculate the intermediate outputs f11, f12, h11, h12, f21, as well as the final output ŷ.
2) Calculate Δw_j and w_j for j = 1, …, 7 after one update iteration (taking the learning rate η = 0.1).
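Not part of the exercise: since the network graph is not reproduced in this text, the Python sketch below uses a hypothetical 7-parameter architecture (two sigmoid hidden units feeding one linear output) and made-up initial weights. It only illustrates the generic update Δw_j = −η·∂E/∂w_j for the data point (x, y) = (2, 1) with η = 0.1, estimating the derivatives by finite differences rather than by the backpropagation formulas asked for in Exercise 6.

import math

# Hypothetical stand-in architecture (the real formulas depend on the graph in the exercise):
#   h1 = sigma(w1*x + w2),  h2 = sigma(w3*x + w4),  y_hat = w5*h1 + w6*h2 + w7
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    h1 = sigmoid(w[0] * x + w[1])
    h2 = sigmoid(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def error(w, x, y):
    return (y - predict(w, x)) ** 2            # E(w) = (y - y_hat)^2

x, y = 2.0, 1.0                                # the data point from the exercise
eta = 0.1                                      # learning rate
w = [0.1, -0.2, 0.3, 0.0, 0.5, -0.4, 0.2]      # made-up initial weights
eps = 1e-6                                     # finite-difference step

grad = []
for j in range(len(w)):
    w_plus = list(w)
    w_plus[j] += eps
    grad.append((error(w_plus, x, y) - error(w, x, y)) / eps)

delta_w = [-eta * g for g in grad]             # Delta w_j = -eta * dE/dw_j
w_new = [wj + dj for wj, dj in zip(w, delta_w)]
print(delta_w)
print(w_new)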