Pythia - The Neural Network Designer
© 2000 by Runtime Software

Introduction
Pythia is a program for the development and design of Neural Networks.
Neural Networks are used to detect hidden relations in a set of patterns, e.g. stock market
data or weather data.
Pythia features Backpropagation Networks. The network parameters (weights) are
initially set to random values. During the training phase the actual output of the
network is compared with the desired output and the error is propagated back toward the
input of the network.
A Neural Network has two phases, commonly referred to as the Training phase and the
Reproduction phase. During the training phase sample data containing both inputs
and desired outputs are processed to optimize the network's output, meaning to
minimize the deviation
(OUTPUTDATA - OUTPUTNET)²
OUTPUTDATA is the output value in the training data, OUTPUTNET is the output value
provided by reproducing the input data with the network.

During the reproduction phase the network's parameters are no longer changed and
the network is used to reproduce input data in order to predict suitable output
data.
Picture 1 shows a typical Backpropagation Network. It has 2 inputs and 1 output. It
contains two layers (levels) of neurons: level 1 with 2 neurons and level 2 with 1 neuron.
In Backpropagation Networks each neuron has one output and as many inputs as there are
neurons in the previous level. Each network input is connected to every neuron in the
first level. Each neuron output is connected to every neuron in the next level.

Picture 1 A typical Backpropagation network

The network's output is the output of the last level's neurons.


Each neuron's output is calculated as

On = F( Σk Ik * Wkn )

On is the neuron's output, n is the number of the neuron,
Ik are the neuron's inputs, k runs over the inputs,
Wkn are the neuron's weights.
F is the Fermi function 1/(1+Exp(-4*(x-0.5)))

The network is processed from left to right. Picture 2 shows an example of how the
output of a Neural Network is calculated from the input.

Picture 2 Sample calculation of a Network


The activity of N1 is calculated as
A = (1*0.249733)+(0*-0.233776) = 0.249733
The output of N1 is calculated as
O = Fermi(A)= 1/(1+Exp(-4*(0.249733-0.5))) = 0.268731
The outputs of N1 and N2 are the inputs for the calculation of N3
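
The same forward pass can be reproduced in a few lines of Python. The weights of N1 are
the ones given above; the weights of N2 and N3 are placeholders, since they only appear
in Picture 2 and not in the text:

    import math

    def fermi(x):
        # Pythia's activation function: 1 / (1 + e^(-4*(x - 0.5)))
        return 1.0 / (1.0 + math.exp(-4.0 * (x - 0.5)))

    def neuron(inputs, weights):
        # Activity is the weighted sum of the inputs, the output is Fermi(activity)
        activity = sum(i * w for i, w in zip(inputs, weights))
        return fermi(activity)

    inputs = [1.0, 0.0]
    w_n1 = [0.249733, -0.233776]   # weights taken from the sample calculation
    w_n2 = [0.1, -0.3]             # placeholder weights (not given in the text)
    w_n3 = [0.5, 0.5]              # placeholder weights (not given in the text)

    o1 = neuron(inputs, w_n1)      # 0.268731, as calculated above
    o2 = neuron(inputs, w_n2)
    out = neuron([o1, o2], w_n3)   # the network output
    print(o1, o2, out)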

Pythia allows you to import data from different file formats or from spreadsheet
programs like Microsoft Excel. You can design and train Neural Networks. Both data
and networks can easily be stored on disk.


A special feature of Pythia is the Evolutionary Optimizer that automatically generates
suitable networks for a given training data set. It uses evolutionary algorithms for the
selection and generation of neural networks.
In order to get familiar with Pythia we recommend working through the two examples
discussed below.

Example 1 XOR problem


A classic example for Neural Networks to learn is the XOR problem. The network is
supposed to learn the pattern:

Pattern     Input 1   Input 2   Output
Pattern 1   0         0         0
Pattern 2   0         1         1
Pattern 3   1         0         1
Pattern 4   1         1         0
Picture 3 The XOR problem


The network we need to design will have 2 inputs and 1 output.
First of all we need to get the training patterns (0,0,0), (0,1,1), (1,0,1) and (1,1,0) into
Pythia.
The easiest way is to use a spreadsheet program like Excel.

Enter the pattern as shown in Picture 4.
Select columns A-C and rows 1-4 with the mouse.
Select EDIT COPY or simply press CTRL-INS.

The data are now in the Windows clipboard.

Picture 4 Data entry into an Excel table

The next step is to start Pythia.
Initially Pythia will look as shown in Picture 5.

Picture 5 Pythia after the program start

Select EDIT PASTE CELLS or simply press SHIFT-INSERT.
You will get a dialog box as shown in Picture 6.

Leave the Field delimiter (TAB) and the Fields per record (3) unchanged.
Paste to new Pattern Set is correct too, because we want to create a whole new pattern set.
Notice that No outputs is set to 1 by default. If your network is supposed to have more
than just one output, you must specify it here.
Finally press the Ok button.


Picture 6 Paste dialog


Picture 7 shows Pythia after pasting the pattern set data.

Picture 7 The pattern set data have been pasted into a new table
You see the columns I1, I2 and O1. The columns O1(NET) and SQ DV are empty
because they will be filled later by the Neural Network we design now.
It is a good idea to save the pattern set under the name XOR.PAT now.


To create a new Neural Network do the following steps:

Select NET CREATE NET.
Enter the number of inputs your net will have (2).
Enter the number of neurons in level 1. We select 2.
Enter the number of neurons in level 2. Since this is the last level we define, it is
equal to the output level. Therefore select 1.
Finally press the Ok button.

Picture 8 Create a new Neural Network


Picture 9 A newly created Neural Network

Now try to reproduce the pattern set with the new network.

Select NET REPRO PATTERN SET.

As you see in Picture 10 the results are quite poor. The column O1(NET) should be
similar to O1. The rightmost column SQ DV tells you the square of the deviation and is
defined as

SQ DVi = (O1i - O1(NET)i)²

i is the pattern number.
Example: (0 - 0.071094)² = 0.005054

Picture 10 Reproduction with the untrained net

It is the goal of the training phase to minimize these values. When you look at the desired
outputs in column O1 you will see that the output is either 0 or 1. Consequently an
O1(NET) value of < 0.5 will be sufficient for 0 and a value > 0.5 will be sufficient for 1.
This means the deviation for each reproduced pattern must be < 0.5, meaning the square
of this deviation must be < 0.25.
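
Expressed as a small Python check (a hypothetical helper, not a function Pythia provides),
the criterion reads:

    def trained_sufficiently(desired, produced, limit=0.25):
        # True if the squared deviation of every pattern is below the limit
        return all((d - p) ** 2 < limit for d, p in zip(desired, produced))

    # Desired XOR outputs against the outputs of a still untrained net
    # (only the first value, 0.071094, appears in the text; the rest are placeholders)
    print(trained_sufficiently([0, 1, 1, 0], [0.071094, 0.48, 0.52, 0.45]))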
Let's set up a learn plan.

Select NET LEARN PATTERN SET.

Picture 11 shows the dialog window for setting up the learn plan.

Please leave the default values with the exception of *deviation. Check this item and
enter the value 0.25.
Press the Ok button.

Picture 11 Set up a learn plan

If every deviation is below 0.25 the net is trained sufficiently. This is probably not the
case the first time you try it. The reason is that not all starting values for the neurons'
weights cause this network to learn the pattern set.
Please set new random weights before you try the training again. You can do this either
by checking the box Set weights randomly or by selecting the appropriate menu item
from the main menu bar (NET SET RANDOM WEIGHTS).
Do the training again.
Repeat until every deviation is below 0.25.
After achieving this goal the network is trained sufficiently to predict the correct
outcome of the possible inputs (0,0), (0,1), (1,0) and (1,1).
You can save the network on disk now.
It should be mentioned that the pattern set XOR.PAT would be learned by a larger Neural
Network on the first trial. However, there are several reasons to prefer a network as small
as possible. One reason is the performance of the network: a small one is calculated
faster. The second, even more important reason is the network's ability to abstract. A
large network might be able to learn the training patterns, but might fail on slightly
different data during the reproduction phase.
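
Pythia's exact training procedure is not documented here, but the idea behind
Backpropagation training can be sketched in plain Python for the 2-2-1 XOR network,
using the Fermi activation defined earlier. As noted above, not every random
initialization converges, so several restarts may be needed:

    import math, random

    PATTERNS = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # the XOR pattern set

    def fermi(x):
        return 1.0 / (1.0 + math.exp(-4.0 * (x - 0.5)))

    def train_xor(learn_rate=0.5, repetitions=10000):
        # 2 hidden neurons with 2 inputs each, 1 output neuron with 2 inputs
        w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
        w_out = [random.uniform(-1, 1) for _ in range(2)]
        for _ in range(repetitions):
            for inputs, target in PATTERNS:
                # Forward pass
                hidden = [fermi(sum(i * w for i, w in zip(inputs, ws))) for ws in w_hidden]
                out = fermi(sum(h * w for h, w in zip(hidden, w_out)))
                # Backward pass: the derivative of the Fermi function is 4*f*(1-f)
                d_out = (target - out) * 4 * out * (1 - out)
                d_hid = [d_out * w_out[n] * 4 * hidden[n] * (1 - hidden[n]) for n in range(2)]
                # Weight updates
                for n in range(2):
                    w_out[n] += learn_rate * d_out * hidden[n]
                    for k in range(2):
                        w_hidden[n][k] += learn_rate * d_hid[n] * inputs[k]
        return w_hidden, w_out

    w_hidden, w_out = train_xor()
    for inputs, target in PATTERNS:
        hidden = [fermi(sum(i * w for i, w in zip(inputs, ws))) for ws in w_hidden]
        out = fermi(sum(h * w for h, w in zip(hidden, w_out)))
        print(inputs, target, round(out, 3), "SQ DV =", round((target - out) ** 2, 6))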


Example 2 Stock Market Prediction


A nice application for Neural Networks is the prediction of the stock market. We keep
the following example very simple and merely show how to prepare the stock market data
for input into a Neural Network. Results should always be subject to human interpretation
and it is always the investor's risk to bet any real money on it.
It must be stated clearly: No Neural Network can ever predict a market crash or any point
gain or loss due to actual economic or political events you don't know in advance. But a
Neural Network might be able to discover relations between stock market data that are
not obvious.
Our example will examine the Dow Jones Industrial Index in 1999.
Picture 12 shows some DJI data for 1999. Our idea is that there might be a relation
between today's open, high, low, close and volume and tomorrow's close.

Date      Open      High      Low       Close     Volume
1/4/99    9184.01   9350.33   9122.47   9184.27   883
1/5/99    9184.78   9338.74   9182.98   9311.19   779
1/6/99    9315.42   9562.22   9315.42   9544.97   986
1/7/99    9542.14   9542.14   9426.02   9537.76   857
1/8/99    9538.28   9647.96   9525.41   9643.32   940
1/11/99   9643.32   9643.32   9532.61   9619.89   816
:
Picture 12 DJI open, high, low, close and volume in 1999

Using the values directly as input into our network is not advisable because our interest
is merely the percentage drop and gain. We therefore derive our input data from the
original data in the following way:

Open(t)    = % change day t-1 to day t   = (Open(t)-Open(t-1))/Open(t-1)*100
High(t)    = % rel. to Open              = (High(t)-Open(t))/Open(t)*100
Low(t)     = % rel. to Open              = (Low(t)-Open(t))/Open(t)*100
Close(t)   = % change day t-1 to day t   = (Close(t)-Close(t-1))/Close(t-1)*100
Volume(t)  = % change day t-1 to day t   = (Volume(t)-Volume(t-1))/Volume(t-1)*100

The output we define as

Close(t+1) = % change day t to day t+1   = (Close(t+1)-Close(t))/Close(t)*100
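
If you prefer scripting the preparation step instead of using a spreadsheet, the same
transformation can be sketched in Python (the tuple layout and field names are merely an
assumption for this illustration, not something Pythia requires):

    def derive_features(rows):
        # rows: list of (date, open, high, low, close, volume) tuples, oldest first.
        # Returns one derived record per day t, starting with the second day.
        features = []
        for t in range(1, len(rows)):
            date, o, h, l, c, v = rows[t]
            _, o0, _, _, c0, v0 = rows[t - 1]
            rec = {
                "date": date,
                "open":   (o - o0) / o0 * 100,   # % change vs. previous day
                "high":   (h - o) / o * 100,     # % relative to today's open
                "low":    (l - o) / o * 100,     # % relative to today's open
                "close":  (c - c0) / c0 * 100,   # % change vs. previous day
                "volume": (v - v0) / v0 * 100,   # % change vs. previous day
            }
            if t + 1 < len(rows):                # desired output: tomorrow's close change
                rec["close+1"] = (rows[t + 1][4] - c) / c * 100
            features.append(rec)
        return features

    rows = [
        ("1/4/99", 9184.01, 9350.33, 9122.47, 9184.27, 883),
        ("1/5/99", 9184.78, 9338.74, 9182.98, 9311.19, 779),
        ("1/6/99", 9315.42, 9562.22, 9315.42, 9544.97, 986),
    ]
    print(derive_features(rows))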

The formulas can easily be entered into a spreadsheet table and will then look as shown in
Picture 13:

(the six columns left of the bar are the native values, the six columns right of the bar are
the derived percentage values)

Date      Open     High     Low      Close    Volume | Open      High      Low       Close     Volume    Close+1
12/31/98  9274.12  9287.77  9181.43  9181.43  755    | -0.50274  0.147184  -0.99945  -1.005    27.10438  0.030932
1/4/99    9184.01  9350.33  9122.47  9184.27  883    | -0.97163  1.810974  -0.67008  0.030932  16.95364  1.381928
1/5/99    9184.78  9338.74  9182.98  9311.19  779    | 0.008384  1.676251  -0.0196   1.381928  -11.778   2.510742
1/6/99    9315.42  9562.22  9315.42  9544.97  986    | 1.422353  2.649371  0         2.510742  26.57253  -0.07554
1/7/99    9542.14  9542.14  9426.02  9537.76  857    | 2.433814  0         -1.21692  -0.07554  -13.0832  1.106759
1/8/99    9538.28  9647.96  9525.41  9643.32  940    | -0.04045  1.149893  -0.13493  1.106759  9.684947  -0.24297
1/11/99   9643.32  9643.32  9532.61  9619.89  816    | 1.101247  0         -1.14805  -0.24297  -13.1915  -1.50948
1/12/99   9618.86  9620.15  9451.77  9474.68  794    | -0.25365  0.013411  -1.73711  -1.50948  -2.69608  -1.32057
1/13/99   9471.34  9471.34  9213.1   9349.56  935    | -1.53365  0         -2.72654  -1.32057  17.75819  -2.44536
1/14/99   9349.56  9359.08  9087.72  9120.93  798    | -1.28577  0.101823  -2.80056  -2.44536  -14.6524  2.407868
1/15/99   9127.16  9342.61  9127.16  9340.55  921    | -2.37872  2.360537  0         2.407868  15.41353  0.157057

Picture 13 DJI data modified for input into a Neural Network


Now you can select and copy the six rightmost columns and paste them into Pythia. You
can also simply load the sample file DOW99I.PAT that comes with Pythia.
Our approach will be to train a network with the data of the first half of 1999. With the
then trained network we will examine the output for the reproduction of the second half of
1999.
Our Neural Network expects any input and output value to be between 0 and 1. Therefore
the pattern sets must be normalized before being processed by the network.
The normalization is calculated:
N(i) = (i - low) / (high - low)
i is the input (or output) value
low is the minimum possible value
high is the maximum possible value

This is usually done automatically, but we should take the time to examine the
normalization parameters now.
Select PATTERN OPTIONS. For our current example you see the options as shown in
Picture 14.
Input1, for example, has possible values between -2.378722 and 2.846741.
Suppose you have to normalize the value I1=0.5. You would calculate:
N(0.5) = (0.5 - (-2.378722)) / (2.846741 - (-2.378722))
       = 0.55090276
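
The normalization and its inverse (which is needed to display results in native form again)
can be written as two small Python helpers; the function names are only chosen for this
illustration:

    def normalize(value, low, high):
        # Maps a native value into the 0..1 range expected by the network
        return (value - low) / (high - low)

    def denormalize(n, low, high):
        # Maps a network value back into the native range
        return n * (high - low) + low

    # The example from the text: Input1 ranges from -2.378722 to 2.846741
    print(normalize(0.5, -2.378722, 2.846741))   # 0.5509027...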


Once you create a new pattern set, Pythia calculates the lowest and the highest value for
each column. You can change these values in the options dialog if you like.
If you train a network with a certain pattern set you must choose the same normalization
parameters for a pattern set you are going to reproduce with this network.
The display format parameters only specify how the numbers are displayed in the table.

Picture 14 Options for pattern sets

Now let's design an appropriate network.

Create a new network with the values (5,5,7,1). This network has the required 5
inputs and 1 output.
Select NET LEARN PATTERN SET.
Leave the default values except the repetition count, which we set to 3000.
Press the Ok button.

The training pattern set will now be processed 3000 times. Finally the trained network
automatically reproduces the inputs of the pattern set.
The results can be examined in Picture 15. Have a look at the columns O1 and O1(NET).
As you remember, O1(NET) is the output generated by the Neural Network, O1 is the
output we would like to get.
The values describe the percentage drop or gain between two days' close values.
Our network has at least learned some of the patterns. Especially when the network makes
a significant statement (let's say it predicts a gain or drop of 0.6% or more) it is usually
right about the direction the DJI moves.
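
This kind of statement can be checked with a small script. The following sketch counts
how often the predicted and the actual close change point in the same direction whenever
the prediction is significant; the 0.6% threshold is the one mentioned above, everything
else is just an illustrative helper, not part of Pythia:

    def directional_hit_rate(predicted, actual, threshold=0.6):
        # predicted, actual: lists of percentage changes (in native, de-normalized form)
        hits = total = 0
        for p, a in zip(predicted, actual):
            if abs(p) >= threshold:              # only count significant statements
                total += 1
                hits += (p > 0) == (a > 0)       # same direction?
        return hits / total if total else None

    print(directional_hit_rate([0.8, -0.2, -1.1, 0.3], [1.4, 0.5, -0.6, -0.9]))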


Picture 15 The network after the training

Now let's test the same network with the data of the second half of 1999. The network
has not been trained with these data, so they are completely unknown to the network.

Please load dow99ii.pat into Pythia. Note that the normalization options are the
same for dow99i.pat and dow99ii.pat.
Reproduce this pattern set.

You see that this time the outcome is very poor.


Picture 16 Reproducing the 2nd half 1999 data generates poor results

The next step would be to modify the model, usually by adding input parameters like
interest rates, overseas market indices, currency ratios etc.


Evolutionary Optimizer
The Evolutionary Optimizer is a tool for generating Neural Networks. The only thing you
must provide is the training pattern set. Let's see how the Evolutionary Optimizer
generates a network for the XOR sample:

Load the pattern set XOR.PAT into Pythia.
Select NET EVOLUTIONARY OPTIMIZER.

You will see a dialog window as shown in Picture 17.

Uncheck the medium deviation field.
Change the *deviation field to 0.25.
Change the # neurons field to 3.
Press the Ok button.

The Evolutionary Optimization will start processing now. It will stop if the Goals to
achieve are met, meaning
*deviation < 0.25 and
# neurons <= 3
However, it will perform at most 1000 evolution steps.

Picture 17 The Evolutionary Optimizer

You can find further explanations of the options in the reference part of this manual.


Picture 18 The Evolutionary Optimizer generated a network


After a couple of generations the optimizer finally found a suitable network in line 11.
This network has 3 neurons and the max. deviation for any pattern is 0.078666.
Click the checkbox on the left of this line and press the Ok button. You now have a
network that is sufficiently trained for the XOR problem.


Reference
The reference part of this manual will describe in detail the menu items and the settings
in the various dialog boxes.
The Main Window
Pythia's main window contains three areas:

The pattern set area on the left side
The network area on the right side
A log window at the bottom

The pattern and the network area can contain any number of pattern sets or networks.
Any operation a user selects will process the currently selected pattern set or network.
The log window displays the last transactions.
FILE EXIT
This terminates Pythia.
PATTERN READ PATTERN SET
This menu item reads a previously saved pattern set into Pythia. Pattern set files have the
extension .PAT.
PATTERN IMPORT PATTERN SET
This menu item imports a data file as a pattern set into Pythia. The data file's format is
assumed to be ASCII.
After choosing a file name a dialog box as shown in Picture 19 will pop up. A preview
window displays the first lines of the file.
You can further specify a field delimiter, the number of fields per record and the number
of outputs.
Picture 19 Import pattern file dialog


Pythia tries to determine the field delimiter and the fields per record value automatically.
Usually you won't have to change these.
The number of outputs field defaults to 1. You must change this if your data contain more
than one output.
Finally press the Ok button. Your new pattern set will be displayed on the left side of
Pythia's main window.
Note: During import some values are set automatically. These settings are the
Normalization parameters and the Display format.
Normalization:
The High of each input or output is set to the maximum value of the referred column.
The Low of each input or output is set to the minimum value of the referred column.
Display format:
The display for each column is set to (9,6), meaning the values will be displayed as a 9
character string with 6 characters after the decimal point.
PATTERN SAVE PATTERN SET
This menu item allows you to save the currently selected pattern set. Any special settings
like Normalization and Display format parameters are stored along with the pattern set.
PATTERN CLOSE PATTERN SET
This menu item closes the currently selected pattern set. The user will be prompted to
save this pattern set if it was modified.


PATTERN OPTIONS
This menu item allows you to change certain settings for the pattern set.
Normalization:
A Neural Network expects any input and output value to be between 0 and 1. Therefore
the pattern sets must be normalized before being processed by the network.
The normalization is calculated:
N(i) = (i - low) / (high - low)
i is the input (or output) value
low is the minimum possible value
high is the maximum possible value

The dialog box shown in Picture 20 allows you to change the normalization parameters.

Picture 20 Options for pattern sets

The initial parameters are calculated when the pattern set is first created, either by
importing or by pasting. They are set to the lowest and highest value found in a column.
The low..high range should span any possible value. However, if your data set contains
any runaway values you should either exclude these from your data or ignore them by
manually adjusting the normalization parameters.
You can change values in the options window by simply entering the new value into the
cell.
Alternatively, you can make use of the following operations:
Push this button to specify a new low value for all columns.
Push this button to specify a new high value for all columns.
Restore the default values (new calculation as if the pattern set was just created).
Discard changes.
Note: If you train a network with a certain pattern set you must choose the same
normalization parameters for another pattern set you are going to reproduce with this
network.
The display format parameters only specify how the numbers are displayed in the table.


Display format:
The display format parameters specify how the numbers are displayed in the table.
By default the display for each column is set to (9,6), meaning the values will be
displayed by a 9 character string with 6 characters after the decimal point.
Use the following operations to change the look of the numbers:
Push this button to specify a new width for all columns.
Push this button to specify a new decimal value for all columns.
Restore the default values (9,6).
Discard changes.
PATTERN TOGGLE VIEW
Select this command to toggle between the original and the normalized form of the
pattern set.
You recognize the currently selected view on the tab of the pattern set. The notation
N(xxx.pat) means normalized, xxx.pat indicates native (original).


NET CREATE NET

This menu item is used to create a new Neural Network.
Enter the new network's desired topology into the dialog box displayed in Picture 21.
Begin with the number of inputs the new network is supposed to have. Continue with the
levels (usually 2 or 3 levels). Keep in mind that the last described level is the output level,
meaning the number of neurons in the last level must match the number of outputs you
need.
Pressing the Sample button automatically creates a sample topology that is able to process
the currently selected pattern set (if there is one).

Picture 21 Create a new Neural Network

Finally press the Ok button. The newly created Neural Net is now displayed in the
network area of Pythia's main window.
Picture 22 shows how this will look.


Picture 22 A newly created Neural Network

NET READ NET

This menu item reads a previously saved network into Pythia. Network files have the
extension .NN.
NET CLOSE NET
This menu item closes the currently selected network. The user will be prompted to save
this network if it was modified.
NET SAVE NET
This menu item allows you to save the currently selected network. The network topology
and weights are stored to a file with the extension .NN.


NET EVOLUTIONARY OPTIMIZER


The Evolutionary Optimizer is a tool for generating Neural Networks. The only thing you
must provide is the training pattern set. Selecting this menu item pops up a dialog box as
displayed in Picture 23.
How the Evolutionary Optimizer works
(the following brief description assumes parameters to be set as shown in Picture 23)
The Evolutionary Optimizer initially creates a generation containing 50 randomly created
networks.
Each network within this generation will be trained briefly and its fitness determined
according to the parameters in Goals to achieve.
Then a new generation of networks will be created from the old one according to the
following procedure:

Two parent networks will be chosen out of the old generation. The selection
algorithm will choose networks with a high fitness with a higher probability.
Two children networks will be created from the two parent networks.
With a probability of 0.2 the two children networks will be crossed over. This
means they will swap levels with each other.
Example:
Child 1: (2,2,1)   is split into (2,2 | 1)
Child 2: (2,6,6,1) is split into (2,6 | 6,1)
Compose the 1st part of child 1 with the 2nd part of child 2 to get (2,2,6,1).
Compose the 1st part of child 2 with the 2nd part of child 1 to get (2,6,1).
The children have been crossed over to (2,2,6,1) and (2,6,1).
The children will now be mutated with a probability of 0.04. Mutation means
insertion or deletion of a level, insertion or deletion of a neuron in a level, or a
change of weights.
The two parent networks will now be checked to see whether they belong to the 10
fittest of the old generation. If they do, they will be mutated and rolled over into the
new generation.

The selection continues until the new generation has 50 members too. After completion
the new generation will be evaluated.
The Evolutionary Optimizer is considered ready as soon as it has found a network with a
fitness of 100.
Otherwise it continues for 1000 generations, which might take hours or days to compute.
You can always stop the process if you do not see any progress.
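
The crossover and mutation of topologies can be illustrated with a short sketch. The
topology lists use the same notation as above (inputs followed by the neuron count per
level); the cut points, value ranges and helper names are assumptions made for this
illustration, since Pythia's internal implementation is not documented:

    import random

    def crossover(child1, child2):
        # Swap the tails of two topology lists at random cut points
        cut1 = random.randrange(1, len(child1))
        cut2 = random.randrange(1, len(child2))
        return child1[:cut1] + child2[cut2:], child2[:cut2] + child1[cut1:]

    def mutate(topology):
        # Insert or delete a hidden level, or change the size of a hidden level.
        # The first entry (inputs) and the last entry (output level) stay untouched.
        topology = list(topology)
        op = random.choice(["insert", "delete", "resize"])
        if op == "insert" or len(topology) <= 2:
            topology.insert(random.randrange(1, len(topology)), random.randint(1, 8))
        elif op == "delete" and len(topology) > 3:
            del topology[random.randrange(1, len(topology) - 1)]
        else:
            topology[random.randrange(1, len(topology) - 1)] = random.randint(1, 8)
        return topology

    # With both cut points after the second entry this reproduces the example above:
    # (2,2,1) x (2,6,6,1) -> (2,2,6,1) and (2,6,1)
    c1, c2 = [2, 2, 1], [2, 6, 6, 1]
    print(c1[:2] + c2[2:], c2[:2] + c1[2:])
    print(crossover(c1, c2), mutate(c1))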


Options
There are a number of options with which you can fine-tune the optimizer's work.
1st ancestor Neural Net:
Specify a template network the first generation is created from, or let Pythia generate a
new random one with create new.
Pattern set to learn:
Specify the pattern set you want to generate a network for. This pattern set must have
been loaded previously.
Check the option Mix pattern randomly if you want to overcome a given but unwanted
sort order in the pattern set. Note that a pattern set's sort order influences the training
phase of the network.

Picture 23 The Evolutionary Optimizer

Goals to achieve:
Here you can specify what the network should be optimized for. There are three possible
goals:

Optimize for medium deviation
Optimize for max. deviation within the pattern set (*deviation)
Optimize for size (# neurons)

Tag or untag the checkboxes on the left side to determine whether a certain goal shall play
a role at all.
If a goal is tagged, specify the value you want to push the network below.
Specify the contribution a goal will make to the overall fitness.
The example in Picture 23 does not care about the medium deviation, but wants the max.
deviation to be below 0.25 and the network size below or equal to 3. Both checked goals
will contribute 50% to the overall fitness of an evolutionarily created network (1:1 =
50:50). If you wanted the network size to have an importance of only 20% you would
have to enter the values 4 and 1 into the contribution fields for max. deviation and size.
Evolutionary algorithm settings:
Different options influence the optimization process:

Population size: number of networks created per generation.
Evolution steps: max. number of generations before stopping.
Mutation rate: probability for a network to be modified during rollover to a new
generation.
Cross over rate: probability for a child network to be crossed over with another
child network.
# fittest/Generation: number of networks that are rolled over to a new generation
as they are. All other networks of a generation are discarded. However, a discarded
network might still become a parent network.
Modify fittest: if checked, the rolled-over networks are modified with the given
mutation rate.

Pressing the Ok button starts the evolutionary process. You can watch the progress in the
progress window shown in Picture 24.
During optimization you can always stop the process by pressing the Stop button.
The colored circle to the right of the checkbox shows you the quality of a network. Red
means no goal achieved at all, yellow means some goals achieved and green means all
goals achieved. If there is any green network the optimization terminates.

The column Topology describes the levels of the network.
The column Neurons gives the number of neurons the network owns.
The column dev shows the network's medium square deviation on pattern set
reproduction.
The column *dev shows the network's max. square deviation on pattern set
reproduction.
The column Fitness describes the network's fitness with a number between 0
and 100.


Picture 24 The Evolutionary Optimizer generated a network

When the optimization stops (either by finding a network with a fitness of 100 or by
stopping it manually) you can choose one or more networks to be moved into the network
area of Pythia's main window.
The moved networks are ready for use.
NET REPRO PATTERN SET
This menu item causes the currently selected network to reproduce the currently selected
pattern set. The net output columns and the deviation column will be refreshed.
Note: You can only choose this command if the currently selected network is compatible
with the currently selected pattern set. Compatibility means the same number of inputs
and the same number of outputs.


NET LEARN PATTERN SET

Choosing this menu item pops up a dialog box (Picture 25) that allows you to specify a
learn plan for the training of the currently selected network with the currently selected
pattern set.
Options
Neural Network to train:
Specify the network you want to train. This network must be compatible with the pattern
set and must have been loaded into the network area of the main window.
Pattern set to learn:
Specify the pattern set you want to train the network with. This pattern set must have been
loaded previously.
Check the option Mix pattern randomly if you want to overcome a given but unwanted
sort order in the pattern set. Note that a pattern set's sort order influences the training of
the network.

Picture 25 Set up a learn plan

Set weights randomly:
Check this box to set the network's weights to initial random values. This might be
necessary because sometimes the given weight values do not lead to a suitable network
even if this is possible.
Train until:
Declare the cancel criteria of the learn plan here. You can connect the criteria with each
other by the AND or the OR operator.
The criteria for exiting the learn plan are:

Repetition
Medium deviation
Max. deviation within the pattern set (*deviation)
Time passed


Tag or untag the checkboxes on the left side to determine whether a certain criterion shall
play a role at all.
In the example in Picture 25 the learn plan will stop as soon as
either the repetition count reaches 1000
or the max. deviation is below 0.25
or 300 seconds have passed.
Learn rate:
The learn rate specifies how fast an error is propagated backwards. A large value
accelerates the learning, but might cause the network to overshoot the mark. Choose a
value between 0.1 and 0.5.
Tag Automatically adjust if you want Pythia to adjust the learn rate during the training
automatically. (Note: this feature is not implemented yet.)
Finally:
Tag Reproduce pattern set if you want to get the pattern set reproduced after the training.
This is equivalent to the command NET REPRO PATTERN SET.
Check show results in native form to get the pattern set shown in its original form
instead of the normalized form. This is equivalent to the command PATTERN
TOGGLE VIEW.
NET REPRODUCE SINGLE
This menu item causes the currently selected network to reproduce a single pattern.
Manually enter the inputs, delimited by a ','.
The number of values you need to enter equals the number of inputs of the network.
Furthermore, you need to specify whether the data entered is to be interpreted as
normalized or original.

Picture 26 Reproduce a single pattern


The result of this operation will be displayed in Pythia's log window at the bottom of the
main window.

Picture 27 Result of the single pattern reproduction

NET LEARN SINGLE

This menu item causes the currently selected network to perform one learn step for a
single pattern.
Manually enter the inputs and outputs, delimited by a ','.
The number of values you need to enter equals the number of inputs plus the number of
outputs of the network.

Picture 28 Perform one learn step for a single pattern

Furthermore, you need to specify whether the data entered is to be interpreted as
normalized or original.
The result of this operation will be displayed in Pythia's log window at the bottom of the
main window.

Picture 29 Result of a single learn step

NET SET RANDOM WEIGHTS

Choose this menu item to reset a network's weights to random values. This might be
necessary because sometimes the given weight values do not lead to a suitable network
even if this is possible.
You need to enter the low..high range for the new random weights. These are usually
between -1 and 1.

Picture 30 Set new random weights


NET SET LEARN RATE

Choose this menu item to change a network's learn rate.
The learn rate specifies how fast an error is propagated backwards. A large value
accelerates the learning, but might cause the network to overshoot the mark. Choose a
value between 0.1 and 0.5.
EDIT COPY CELLS
This command copies a pattern set's selected cells into the Windows clipboard. From
there you can paste the data into other applications, e.g. text editors or MS Excel.
EDIT PASTE CELLS
This command pastes cells from the Windows clipboard into Pythia.
You will get a dialog box as shown in Picture 31.
Field delimiter:
This specifies the delimiter between single fields.
Fields per record:
Specifies the number of fields in each record.
Paste to new Pattern Set:
Select this option if you intend to paste the clipboard contents into a whole new pattern
set. You must specify the number of outputs too.

Picture 31 Paste dialog

Paste to selected Pattern Set:
Select this option if you want to paste the clipboard contents into an existing pattern set.
You can either paste at the current position or append.
Finally press the Ok button.
