NETS Neural Network Software

The NETS software was developed by NASA to assist with the development of neural network applications. NETS uses the generalized delta back propagation learning method.

NETS can be started on the UNIX machines within AGEN by typing "nets" at the prompt. The listing below shows the result of starting the NETS software. Note that the software is menu driven.


 NASA-JSC Software Technology Branch
 NETS Back Propagation Simulator Version 3.0

 b -- show bias values
 c -- create a net
 d -- set dribble parameters
 e -- display error stats for a file
 g -- generate delivery code
 i -- setup I/O pairs
 j -- reset I/O pairs
 l -- change learning rate
 m -- print this menu
 n -- show net configuration
 o -- reorder I/O pairs for training
 p -- propagate an input through the net
 r -- reset weights from a file
 s -- save weights to a file
 t -- teach the net
 u -- scale I/O pairs
 v -- scale a data file
 w -- show weights between two layers 
 q -- quit program
 
NETS Choice(m = menu)?

The following figure shows an example schematic of a net that one might create with NETS.

To create the above neural network using NETS, the "c" option is selected from the NETS menu. You will then be asked for the name of a file that describes the net configuration. The configuration file is a simple ASCII file that specifies the number of layers, the number of nodes in each layer, and the connections between layers. The configuration file for the example in the figure above is shown below.

LAYER : 0    --the input layer 
NODES : 3
 TARGET : 2  -- 2 is the hidden layer number and is the
             -- target for connections from layer 0 
  
LAYER : 1    --the output layer  
NODES : 2 
 
LAYER : 2    -- the hidden layer 
NODES : 4 
 TARGET : 1  -- layer 2 is connected to the output layer 

Note that LAYER 0 is always the input layer and LAYER 1 is always the output layer. Also, the spaces around the colons are significant. The above net configuration file can be created with vi or any other tool that produces an ASCII file.

Below, the result of selecting the "create a net" option of NETS is shown.

NETS Choice(m = menu)? c
 
   Enter filename with net configuration: test.net
 
   Enter maximum weight value(default =  0.250): 
 
   Enter minimum weight value(default =  0.001): 
 
   Use a global learning rate (y/n, default=y)? 
 
   Enter global learning rate (default=  0.400): 
 
   Use a global momentum (y/n, default=y)? 
 
   Enter global momentum (default=  0.000): 
 
   Use biases in network(y/n, default=y)? 

The global learning rate and global momentum variables are used by the training law; their values can be adjusted to help the net converge during training.
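
How these two variables enter the generalized delta rule can be sketched as follows. This is our own simplified Python illustration, not the NETS source: each weight takes a step down the error gradient, scaled by the learning rate, plus a fraction of its previous step, scaled by the momentum.

def update_weight(w, gradient, prev_delta, learning_rate=0.4, momentum=0.0):
    # gradient is dE/dw, the derivative of the output error with
    # respect to this weight; prev_delta is the change applied to
    # this weight on the previous training cycle.
    delta = -learning_rate * gradient + momentum * prev_delta
    # The momentum term carries part of the previous step forward,
    # which can help the net roll through flat spots and local minima.
    return w + delta, delta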

To better understand how one would use NETS to solve a problem, let's explore a simple problem. The XOR (eXclusive OR) problem is a classical problem that neural networks can solve. The task is to create a neural net that simulates an XOR logic gate: it should take two inputs and produce a single output. A truth table for the XOR problem, mapping the inputs to the output, is shown below.

Input 1   Input 2   Output
  0         0         0
  0         1         1
  1         0         1
  1         1         0
The two inputs (Input 1 and Input 2) would be placed on the input layer, and the corresponding value in the right column would be expected as the output. Therefore, each row of the above table corresponds to one input/output pair for the XOR problem. A NETS network configuration file for this problem would look like:
LAYER : 0
NODES : 2
  TARGET : 2

LAYER : 1
NODES : 1

LAYER : 2
NODES : 5
  TARGET : 1
Note that the numbers of nodes in the input and output layers are determined by the input and output specifications. The number of nodes in the hidden layer is often determined by trial and error; a good starting point is approximately twice the number of nodes in the input layer.

In this example, the two inputs in layer 0 are linked to a hidden layer with 5 nodes, and the hidden layer is linked to the output layer, which consists of a single node. Before training our neural net, we must still create a "training" file containing the input/output pairs that will be used to train the network. Such files are generally given an .iop extension and are simply ASCII files with the input/output pairs, as shown below.

(.1 .1 .1)
(.1 .9 .9)
(.9 .1 .9)
(.9 .9 .1)
The input/output pairs are enclosed in parentheses, and any comments are preceded by a double dash (--), as in the network specification file. There are two important things to notice about these pairs. First, the first value of each pair represents the value for input 1, the second value represents the value for input 2, and the third value represents the output. Second, rather than the 1's and 0's typical of a binary encoding of the problem, the values used are ".1" for 0 and ".9" for 1. Since the network can accept any values between 0 and 1, these are perfectly good numbers. Furthermore, due to a subtlety of the learning algorithm, the extreme values of the allowable range (0 and 1) are unreachable, so the best results are usually obtained by using .1 and .9 in their place. It is also worth noting that the greater the separation between the two target values, the easier it is for the network to produce results that fall within the desired error constraint.
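
The unreachable extremes typically come from the sigmoid activation function used by back-propagation networks, whose output approaches 0 and 1 only asymptotically. A small Python illustration (our own, not part of NETS):

import math

def sigmoid(x):
    # Logistic activation: the output lies strictly between 0 and 1,
    # reaching the extremes only as x goes to +/- infinity.
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(5))    # about 0.993 -- close to 1, but never exactly 1
print(sigmoid(-5))   # about 0.007 -- close to 0, but never exactly 0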

The following shows the loading of the network configuration file for the XOR problem.

NETS Choice(m = menu)? c
 
   Enter filename with net configuration: xor.net
 
   Enter maximum weight value(default =  0.224): 
 
   Enter minimum weight value(default =  0.001): 
 
   Use a global learning rate (y/n, default=y)? 
 
   Enter global learning rate (default=  0.400): 
 
   Use a global momentum (y/n, default=y)? 
 
   Enter global momentum (default=  0.000): 
 
   Use biases in network(y/n, default=y)?

Once the network configuration file is loaded, the training input/output pairs are loaded from a file as shown below.

NETS Choice(m = menu)? i
 
   Enter name of file containing I/O pairs  (default=xor.iop): xor.iop
 
*** 4 I/O Pairs read ***
 Do you wish to scale these I/O pairs? n

Once the I/O pairs are loaded, you are ready to "train" the net. In some instances, it is desirable to rearrange the I/O pairs into a random order; the reorder I/O pairs option ("o") will do this. The following shows the result of selecting the teach the net option.

NETS Choice(m = menu)? t
 
   Enter constraint error: .2
 
   Enter max number of cycles(default=10000): 100
 
   Enter cycle increment for showing errors (default=1): 3

The constraint error tells the network how small the error between the output training data (the "truth") and the estimated output for a given set of inputs must be before the network stops learning. In this example, this error is set to 0.2. In many instances, it may take thousands of cycles (presentations of the I/O pairs to the network and adjustment of weights) before the network learns. Thus a maximum number of cycles is also set as a stopping criterion; in this case, it was set to 100. The net will stop training once the constraint error falls below 0.2 or once 100 training cycles have occurred, whichever comes first.
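
In outline, the two stopping criteria combine roughly as in the following Python sketch. This is our own illustration; the train_one_cycle callable, which presents every I/O pair once, adjusts the weights, and returns the largest output error, is hypothetical.

def train(train_one_cycle, constraint_error=0.2, max_cycles=100):
    for cycle in range(1, max_cycles + 1):
        max_error = train_one_cycle()   # one presentation of all I/O pairs
        if max_error < constraint_error:
            return cycle                # learned: error constraint satisfied
    return None                         # timeout failure: net could not learn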

NETS reports on how training is progressing, if desired. In this case, a report has been requested after every 3 cycles. The information that is generated is shown below.

*** Learning; Press break key to suspend ***
 
Cycle : 3   Max error:  0.424    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 6   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 9   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 12   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 15   Max error:  0.422    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 18   Max error:  0.422    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 21   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 24   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 27   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 30   Max error:  0.423    RMS error:  0.411 
   training time =     0.1 seconds
 
Cycle : 33   Max error:  0.424    RMS error:  0.411 
   training time =     0.1 seconds

.....

Cycle : 96   Max error:  0.424    RMS error:  0.411 
   training time =     0.4 seconds
 
Cycle : 99   Max error:  0.424    RMS error:  0.411 
   training time =     0.5 seconds
 
*** Timeout failure ***
 *** Net could not learn after 100 tries ***
 Elapsed training time =     0.5 seconds
 
NETS Choice(m = menu)? 
Note that the network was unable to "learn" our I/O pairs after 100 cycles. For some problems, a network may never be able to learn the data because of inconsistencies in the I/O pairs or for other reasons. In some cases, the net must run for millions of training cycles before the data is learned.

We might let the net keep trying by continuing training from here. If we do so and let it run for another 10000 training cycles, we obtain the following:

Cycle : 1100   Max error:  0.423    RMS error:  0.410 
   training time =     1.9 seconds
 
Cycle : 2100   Max error:  0.423    RMS error:  0.410 
   training time =     3.7 seconds
 
Cycle : 3100   Max error:  0.423    RMS error:  0.410 
   training time =     5.4 seconds
 
Cycle : 4100   Max error:  0.423    RMS error:  0.410 
   training time =     7.1 seconds
 
Cycle : 5100   Max error:  0.423    RMS error:  0.410 
   training time =     8.8 seconds
 
Cycle : 6100   Max error:  0.423    RMS error:  0.410 
   training time =    10.6 seconds
 
Cycle : 7100   Max error:  0.423    RMS error:  0.410 
   training time =    12.4 seconds
 
Cycle : 8100   Max error:  0.423    RMS error:  0.410 
   training time =    14.1 seconds
 
Cycle : 9100   Max error:  0.423    RMS error:  0.410 
   training time =    16.2 seconds
 
Cycle : 10100   Max error:  0.423    RMS error:  0.410 
   training time =    18.1 seconds
 
*** Timeout failure ***
 *** Net could not learn after 10100 tries ***
 Elapsed training time =    18.1 seconds
Our net has become stuck in a local minimum and is not learning our data. To overcome this problem, we can adjust the momentum and/or learning rate variables. Let's adjust the momentum variable as shown below.
NETS Choice(m = menu)? l
 
   Layer number: 1
 
     Enter learning rate for layer 1  (default=  0.400): 
 
     Enter scaling factor or 0 if not desired  (default=  0.000): 
 
     Enter momentum for layer 1  (default=  0.000): .3
We can now continue training from this point. If we do so, we obtain the following results:
NETS Choice(m = menu)? t
 
   Enter constraint error: .2
 
   Enter max number of cycles(default=10000): 
 
   Enter cycle increment for showing errors (default=1): 1000
 
*** Learning; Press break key to suspend ***
 
Cycle : 11100   Max error:  0.419    RMS error:  0.410 
   training time =    19.9 seconds
 
Cycle : 12100   Max error:  0.421    RMS error:  0.407 
   training time =    21.8 seconds
 
Cycle : 13100   Max error:  0.459    RMS error:  0.318 
   training time =    23.7 seconds
 
NETS Choice(m = menu)? 
The net has learned the data and has stopped. Let's test the net, as shown below, by entering 0 for input 1 and 0 for input 2.
NETS Choice(m = menu)? p
 
   Enter filename with INPUT(default=manual entry): 
 
   Enter OUTPUT destination file(default=screen): 
 
   Enter input for node 0: 0
 
   Enter input for node 1: 0
 

The Outputs for Input 1 are:
    for node 0 the value   0.141
Note that the net has correctly predicted a value of 0 (.141). In most cases, values below .5 (or some other chosen threshold) are interpreted as 0. We could now save the weights between the processing elements (neurons) to a file with the "s" option; they could then be reloaded later with the "r" option, so the net would not have to be retrained.
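
Mapping the continuous output back to a logical value is a simple thresholding step, sketched below in Python (our own illustration):

def to_binary(output, threshold=0.5):
    # Outputs below the threshold read as logical 0, the rest as 1.
    return 0 if output < threshold else 1

print(to_binary(0.141))   # -> 0, matching the truth table row (0, 0) -> 0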