ANN Based Modeling

The document provides an overview of Artificial Neural Networks (ANN) and their modeling process, including data collection, preprocessing, and model training. It explains the architecture of neural networks, the learning process through training algorithms like back-propagation, and the importance of adjusting parameters for optimal performance. Additionally, it discusses the role of activation functions and the iterative nature of training to minimize error in predictions.

Uploaded by sklahiri70

ANN-based modeling
What is modelling?

[Diagram: inputs x1, x2, x3 enter a block Y = f(x), which produces the output Y]

Find the f.
What is modelling?

[Diagram: inputs x1, x2, x3 enter a block Y = f(x), which produces the output Y]

In an ANN you give known inputs and outputs, and the ANN finds the f.
Steps
Step 1: Data collection and data inspection
Step 2: Data preprocessing and data conditioning
Step 3: Selection of relevant input/output variables
Step 4: Align data
Step 5: Model parameter selection, training and validation
Step 6: Model acceptance and model tuning
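A minimal sketch of the early steps on synthetic data; the toy target function, sizes, and the 75/25 split are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: data collection (here, synthetic samples of a known function)
X = rng.uniform(-3, 3, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

# Step 2: preprocessing -- scale inputs to zero mean, unit variance
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)

# Steps 4-5: align/split the data for training and validation
split = 150
X_train, X_val = X_scaled[:split], X_scaled[split:]
y_train, y_val = y[:split], y[split:]

print(X_train.shape, X_val.shape)   # (150, 2) (50, 2)
```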
How ANN works?

Neural networks are computer algorithms inspired by the way information is processed in the nervous system. An ANN is a massively parallel-distributed processor.

An important difference between neural networks and standard regression is their ability to learn. This learning property has yielded a new generation of algorithms.
How ANN works?

An ANN paradigm is composed of several highly interconnected processing elements, analogous to biological neurons, that are tied together with weighted connections analogous to synapses.

Learning in biological systems involves adjustments to the synaptic connections between the neurons. This is true for ANNs as well.
Network architecture: bias node

In addition to the N input and L hidden nodes, the MLP architecture also houses a bias node in its input and hidden layers. The bias nodes are also connected to all the nodes in the subsequent layer, and they provide additional adjustable parameters for the model fitting.
Number of hidden nodes

The number of nodes in the MLP network's input layer is equal to the number of inputs in the process, whereas the number of output nodes equals the number of process outputs.

However, the number of hidden nodes is an adjustable parameter whose magnitude is determined by issues such as the desired approximation and generalization capabilities of the network model.
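The sizing rules above can be illustrated with weight-matrix shapes; the hidden-layer width of 8 is an arbitrary assumption for the sketch:

```python
import numpy as np

# Input nodes = number of process inputs, output nodes = number of
# process outputs; the hidden-node count is the free design choice.
n_inputs, n_hidden, n_outputs = 3, 8, 1   # n_hidden = 8 is an assumption

rng = np.random.default_rng(42)
# Weight matrices; the "+ 1" row holds the bias-node weights that the
# bias nodes contribute to every node in the subsequent layer.
W1 = rng.normal(scale=0.1, size=(n_inputs + 1, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden + 1, n_outputs))

# Total adjustable parameters, including the bias weights
print(W1.size + W2.size)   # (3+1)*8 + (8+1)*1 = 41
```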
ANN Learning

Learning typically occurs through training, or exposure to a true set of input/output data, where the training algorithm iteratively adjusts the connection weights. These connection weights represent the knowledge necessary to solve specific problems.

The most widely utilized ANN paradigm is the multilayered perceptron (MLP), which approximates nonlinear relationships existing between an input set of data and the corresponding output data set.
ANN Learning

The MLP learns the approximation through a numerical procedure called 'network training', wherein network parameters are adjusted iteratively such that the network, in response to the input patterns in an example set, accurately produces the corresponding outputs.

A three-layered MLP with a single intermediate layer housing a sufficiently large number of nodes can approximate any nonlinear computable function to an arbitrary degree of accuracy.
ANN Algorithms

 There exist a number of algorithms, each possessing certain positive characteristics, to train an MLP network:
• error-back-propagation
• Quickprop
• Resilient Back-propagation
ANN Training

Training of an ANN involves minimizing a nonlinear error function that may possess several local minima.

Thus, it becomes necessary to employ a heuristic procedure involving multiple training runs to obtain an optimal ANN model whose parameters correspond to the global or the deepest local minimum of the error function.
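The multiple-run heuristic can be sketched as follows; `train_mlp` here is a hypothetical stand-in for a real MLP training routine, reduced to a toy linear fit purely to keep the example self-contained:

```python
import numpy as np

def train_mlp(X, y, seed):
    """Toy stand-in for a training run: returns (weights, final_mse)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])         # seed-dependent initialization
    mse = float(np.mean((X @ w - y) ** 2))  # error reached by this run
    return w, mse

X = np.random.default_rng(0).normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5])

# Train from several random seeds; keep the run with the lowest error,
# hoping it corresponds to the global or deepest local minimum.
runs = [train_mlp(X, y, seed) for seed in range(10)]
best_w, best_mse = min(runs, key=lambda r: r[1])
print(best_mse)
```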
ANN Training

The building of a back-propagation network involves specifying the number of hidden layers and the number of neurons in each hidden layer.

In addition, several parameters have to be specified, including the learning rule, the transfer function, the learning coefficient ratio, the random number seed, the error minimization algorithm, and the number of learning cycles.
Back propagation algorithm

The back-propagation algorithm modifies network weights to minimize the mean squared error between the desired and the actual outputs of the network.

Back-propagation uses supervised learning, in which the network is trained using data for which the inputs as well as the desired outputs are known.

Once trained, the network weights are frozen and can be used to compute output values for new input samples.
Back propagation algorithm

The feed-forward process involves presenting input data to the input layer neurons, which pass the input values on to the first hidden layer.

Each of the hidden layer nodes computes a weighted sum of its inputs, passes the sum through its activation function, and presents the result to the output layer.

The goal is to find a set of weights that minimize the mean squared error.
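The feed-forward process described above can be sketched for a single input sample; the layer sizes, sigmoid hidden activation, and linear output are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """One feed-forward pass: input -> hidden -> output."""
    h = sigmoid(W1 @ x + b1)   # hidden nodes: weighted sum through activation
    return W2 @ h + b2         # output layer receives the hidden results

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 inputs, 4 hidden nodes
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 1 output node
y_hat = forward(np.array([0.2, -1.0, 0.5]), W1, b1, W2, b2)
print(y_hat.shape)   # (1,)
```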
Activation Function
 Each hidden node and output node applies the activation function to its net input
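Two common choices of activation function, applied to a node's net input; which one a given network uses is a modelling choice, not fixed by the slides:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes the net input into (0, 1)

net_input = np.array([-2.0, 0.0, 2.0])
print(sigmoid(net_input))   # ≈ [0.119, 0.5, 0.881]
print(np.tanh(net_input))   # ≈ [-0.964, 0.0, 0.964], range (-1, 1)
```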
Different ANN Algorithms
Pseudo Code of Back propagation algorithm
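The pseudo-code on the original slides did not survive the text export. As a minimal sketch of the standard algorithm (not necessarily the slides' exact version), here is back-propagation by full-batch gradient descent for a one-hidden-layer MLP with sigmoid hidden units and a linear output; sizes, target function, and learning rate are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = (X[:, 0] * X[:, 1]).reshape(-1, 1)          # toy target function

W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
lr = 0.1

for epoch in range(2000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y_hat = h @ W2 + b2
    err = y_hat - y
    mse = float(np.mean(err ** 2))

    # backward pass: propagate the error signal layer by layer
    d_out = 2 * err / len(X)                    # dMSE/dy_hat
    d_hid = (d_out @ W2.T) * h * (1 - h)        # chain rule through sigmoid

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_hid)
    b1 -= lr * d_hid.sum(axis=0)

print(mse)   # mean squared error after training
```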
