Willo - MLX PDF Part3

The document defines a distributed delay neural network with 1 input, 2 layers, and 1 output. It trains the network on an input/target sequence using Bayesian-regularization backpropagation (trainbr, a Levenberg-Marquardt variant), reaching a root-mean-square error of 0.3538 on the training data, and then evaluates the trained network on a new, longer input/target sequence.


k1 = 0:0.01:1;                      % time base for the background segment
p1 = sin(4 * pi * k1);              % background signal, class target -1
t1(1:length(k1)) = -1;
k2 = 1.86:0.01:3.86;                % time base for the segment to detect
p2 = sin(sin(k2).*k2.^2+5*k2)       % segment to detect, class target +1

p2 = 1×201
0.0495 0.1248 0.1991 0.2718 0.3427 0.4113 0.4773 0.5403 ...

t2(1:length(k2)) = 1;
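
For orientation, a minimal sketch (not part of the original script) that plots the two prototype segments:

figure;                             % sketch: visualize p1 and p2
plot(k1, p1, k2, p2), grid;
legend('p1 (target -1)','p2 (target +1)');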
R = [4,3,0];                        % background repetitions before each p2

% Training sequence: p2 segments embedded between blocks of background p1
P = [repmat(p1,1,R(1)),p2,repmat(p1,1,R(2)),p2,repmat(p1,1,R(3)),p2];
T = [repmat(t1,1,R(1)),t2,repmat(t1,1,R(2)),t2,repmat(t1,1,R(3)),t2];
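
Since R(3) = 0, the final p2 follows the previous one with no background block in between. A quick sketch (not in the original) to confirm the concatenated input and target stay aligned:

assert(length(P) == length(T));     % sketch: P and T must be equal length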
net = distdelaynet({0:4,0:4},8);    % delays 0:4 on both weight layers, 8 hidden neurons
display(net);

net =

Neural Network

name: 'Distributed Delay Neural Network'


userdata: (your custom info)

dimensions:

numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 4
numLayerDelays: 4
numFeedbackDelays: 4
numWeightElements: 8
sampleTime: 1

connections:

biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]

subobjects:

input: Equivalent to inputs{1}


output: Equivalent to outputs{2}

inputs: {1x1 cell array of 1 input}


layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}

functions:

adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'
divideParam: .trainRatio, .valRatio, .testRatio
divideMode: 'time'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization
plotFcns: {'plotperform', 'plottrainstate', 'ploterrhist',
'plotregression', 'plotresponse', 'ploterrcorr',
'plotinerrcorr'}
plotParams: {1x7 cell array of 7 params}
trainFcn: 'trainlm'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max

weight and bias values:

IW: {2x1 cell} containing 1 input weight matrix


LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors

methods:

adapt: Learn while in continuous use


configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
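
The display confirms the intended topology: one input feeding an 8-neuron hidden layer through taps 0:4, and taps 0:4 again on the hidden-to-output weight, which is why LW below is 1×40 (8 neurons × 5 taps). A minimal sketch, assuming the standard Deep Learning Toolbox property names, to read the taps directly:

net.inputWeights{1,1}.delays        % sketch: expected [0 1 2 3 4]
net.layerWeights{2,1}.delays        % sketch: expected [0 1 2 3 4]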

% Time-shift the sequences and extract initial delay states for training
[Ps,Pi,Ai,Ts] = preparets(net,con2seq(P),con2seq(T));
net.trainFcn = 'trainbr';           % switch to Bayesian regularization
net.trainParam.epochs = 100;
net.trainParam.goal = 1e-5;
net = init(net);                    % reinitialize weights and biases
net = train(net,Ps,Ts);
Y = net(Ps,Pi,Ai);                  % simulate the trained network
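
train also returns a training record; a sketch (not in the original, and it re-runs training) for inspecting convergence:

[net,tr] = train(net,Ps,Ts);        % sketch: capture the training record
plotperform(tr);                    % performance curve over epochs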

W = net.IW{1}

W = 8×5
1.0833 2.0494 0.4089 1.1598 -2.5191
-2.1662 -1.2528 -0.6011 0.9597 2.6424
-0.9350 -0.4670 0.0556 1.3731 3.0067
-0.8452 1.4999 1.3173 -0.3529 -3.1347
3.3030 0.3361 -2.0088 -1.9682 2.5632
-3.7298 -0.1083 1.4136 1.7032 2.1764
1.6004 1.2752 -0.3178 -1.7666 0.3909
0.3588 0.2976 0.0540 0.1976 0.7564

LW = net.LW{2,1}

LW = 1×40
2.9562 4.6117 4.0238 0.2070 -0.0862 5.0295 3.1802 3.1336 ...

b1 = net.b{1}

b1 = 8×1
-2.3182
1.3867
3.2479
0.1045
2.5428
-0.1121
-0.8002
1.1226

b2 = net.b{2}

b2 = -1.4085

error = cell2mat(Y)-cell2mat(Ts);   % training residuals
mse_error = sqrt(mse(error))        % note: square root of MSE, i.e. the RMSE

mse_error = 0.3538
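
Because the script takes the square root of mse, the reported 0.3538 is a root-mean-square error rather than an MSE. A sketch separating the two quantities:

mse_val  = mse(error);              % sketch: the actual mean squared error
rmse_val = sqrt(mse_val);           % root-mean-square error (0.3538 above)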

X = 1:length(Y);
plot(X,cell2mat(Ts),X,cell2mat(Y)),grid;    % reference vs. network output
legend('reference','output');

plot(X,error),grid;                         % residual over the sequence
legend('error');

R = [4,5,0];                        % new spacing: a longer test sequence
P2 = [repmat(p1,1,R(1)),p2,repmat(p1,1,R(2)),p2,repmat(p1,1,R(3)),p2];
T2 = [repmat(t1,1,R(1)),t2,repmat(t1,1,R(2)),t2,repmat(t1,1,R(3)),t2];
[Ps2,Pi2,Ai2,Ts2] = preparets(net,con2seq(P2),con2seq(T2));
Y2 = net(Ps2,Pi2,Ai2);              % simulate the trained net on unseen data
X2 = 1:length(Y2);
error2 = cell2mat(Y2)-cell2mat(Ts2);
mse_error2 = sqrt(mse(error2));     % test RMSE (output suppressed by ';')
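
The trailing semicolon suppresses the test error, so generalization performance is never printed. A sketch to surface it next to the training error:

fprintf('train RMSE: %.4f, test RMSE: %.4f\n', mse_error, mse_error2);  % sketch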
