# ANN Simulink Examples (For ECE662)

In class, we discussed artificial feed-forward neural network theory in detail. I found a website that may help other students get a jump start on using Simulink to develop either supervised or unsupervised classifiers for their research. See: this Lab Exercise.

**DISCLAIMER:** I have not used these techniques for an actual project, but the NN Toolbox looks very promising. I just think it's neat. --mreeder

## Supervised Learning

The downside to any ANN is a long training time; in this case, 80 "epochs" (complete training passes) by default. For simpler nonlinear systems this might not be an issue. The following MATLAB code uses the Neural Network Toolbox to train a network on a one-dimensional feature space, from the umu.se link above.

```matlab
% Supervised learning example   File name: bp_ex1.m   Date: 01-10-09
% Here is a problem consisting of inputs P and targets T that we would
% like to solve with a network.
P = [0 1 2 3 4 5 6 7 8 9 10];
T = [0 1 2 3 4 3 2 1 2 3 4];
% Here a two-layer feed-forward network is created.  The network's
% input ranges from 0 to 10.  The first layer has six TANSIG
% neurons, the second layer has one PURELIN neuron.  The TRAINLM
% network training function is to be used.
net = newff([0 10],[6 1],{'tansig' 'purelin'});
% Here the network is simulated and its output plotted against
% the targets.
Y = sim(net,P);
figure(1);
plot(P,T,'+',P,Y,'o'); title('Network before training, + is teacher');
% Here the network is trained for 80 epochs.  Again the network's
% output is plotted.
net.trainParam.epochs = 80;
net = train(net,P,T);
Y = sim(net,P);
figure(2)
plot(P,T,'+',P,Y,'o'); title('Supervised output (+), Network output (o)');
xlabel('x'); ylabel('y');
whos                          % Which variables do I have?
net                           % Gives structure information of the trained net
input_weights = net.IW{1,1}
input_bias    = net.b{1}
layer_weights = net.LW{2,1}
layer_bias    = net.b{2}
```
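Once trained, the network's forward pass is just matrix arithmetic: a TANSIG (hyperbolic tangent) hidden layer followed by a PURELIN (identity) output layer. A minimal NumPy sketch of that computation, with randomly drawn weights standing in for the trained `net.IW`/`net.LW`/`net.b` values:

```python
import numpy as np

def tansig(n):
    # MATLAB's tansig is the hyperbolic tangent sigmoid
    return np.tanh(n)

def forward(p, IW, b1, LW, b2):
    """Two-layer feed-forward pass: tansig hidden layer, purelin output."""
    a1 = tansig(IW @ p + b1)   # hidden layer activations (6 neurons)
    return LW @ a1 + b2        # purelin output layer is the identity

# Illustrative (untrained) parameters: 1 input, 6 hidden neurons, 1 output
rng = np.random.default_rng(0)
IW = rng.normal(size=(6, 1))   # input weights, like net.IW{1,1}
b1 = rng.normal(size=(6, 1))   # hidden biases, like net.b{1}
LW = rng.normal(size=(1, 6))   # layer weights, like net.LW{2,1}
b2 = rng.normal(size=(1, 1))   # output bias, like net.b{2}

y = forward(np.array([[5.0]]), IW, b1, LW, b2)
print(y.shape)  # (1, 1): one scalar output per input sample
```

Training (TRAINLM, Levenberg-Marquardt) then just adjusts `IW`, `b1`, `LW`, `b2` to minimize the error between this output and the targets `T`.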

## Unsupervised Learning Example

The self-organizing map (SOM) method for unsupervised learning works well for complex systems: it maps multi-element input vectors onto a low-dimensional (here, two-dimensional) grid of neurons. Example 2 from the link above:

```matlab
% SOM example, 2-dim   File name: som2d_1.m   Date: 01-10-09
% NEWSOM - Creates a self-organizing map.
% TRAIN  - Trains a neural network.
% SIM    - Simulates a neural network.
% This self-organizing map will learn to represent different regions of the
% input space where input vectors occur.  In this demo the neurons arrange
% themselves in a two-dimensional grid rather than a line.  We would like to
% classify 1000 two-element vectors occurring in a rectangular vector space.
P = rand(2,1000);       % input is random, uniform on (0,1)
% We will use a 5-by-6 layer of neurons to classify the vectors above.  We
% would like each neuron to respond to a different region of the rectangle,
% and neighboring neurons to respond to adjacent regions.  We create a layer
% of 30 neurons spread out in a 5-by-6 grid:
net = newsom([0 1; 0 1],[5 6]);
% Initially all the neurons have the same weights in the middle of the input
% space, so only one dot appears.
% Now we train the map on the 1000 vectors and replot the network weights.
% Please wait for it.  Blue lines connect neurons at grid distance 1.
net.trainParam.epochs = 2;
net.trainParam.show = 1;
net = train(net,P);
plotsom(net.IW{1,1},net.layers{1}.distances)
% Note that the layer of neurons has begun to self-organize so that each
% neuron now classifies a different region of the input space, and adjacent
% (connected) neurons respond to adjacent regions.
% We can now use SIM to classify vectors by giving them to the network and
% seeing which neuron responds.
p = [0.5; 0.3];
a = sim(net,p)
% The winning neuron responds with a "1" in the output vector above, so p
% belongs to that neuron's class.
```
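Under the hood, SOM training repeatedly finds the best-matching unit (BMU) for each input vector and pulls that neuron, plus its grid neighbors, toward the input. A minimal NumPy sketch of this update rule on the same 5-by-6 grid (the learning rate and Gaussian neighborhood here are illustrative choices, not the toolbox's exact schedule):

```python
import numpy as np

def som_update(weights, grid_pos, x, lr=0.5, radius=1.0):
    """One SOM step: find the BMU for input x, pull BMU and neighbors toward x.

    weights:  (n_neurons, n_features) neuron weight vectors
    grid_pos: (n_neurons, 2) neuron coordinates on the 2-D grid
    """
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))      # best-matching unit
    grid_dist = np.linalg.norm(grid_pos - grid_pos[bmu], axis=1)
    h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))         # Gaussian neighborhood
    weights += lr * h[:, None] * (x - weights)                # move toward input
    return weights, bmu

# 5-by-6 grid of 30 neurons classifying 2-element vectors in the unit square
rows, cols = 5, 6
grid_pos = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
rng = np.random.default_rng(0)
# start all weights near the middle of the input space, as in the demo
weights = np.full((rows * cols, 2), 0.5) + 0.01 * rng.normal(size=(rows * cols, 2))

for x in rng.random((1000, 2)):    # one pass over 1000 random inputs
    weights, _ = som_update(weights, grid_pos, x)
```

Because neighboring grid neurons share each update, the map unfolds so that adjacent neurons end up responding to adjacent regions of the rectangle, which is exactly what `plotsom` visualizes.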

## Mathworks SOM Example

For better documentation:

```matlab
>> help plotsomhits
```

This example shows a more nonlinear classifier, which still presents its output classes in an intuitive visual display.

```matlab
load iris_dataset
net = newsom(irisInputs,[5 5]);
[net,tr] = train(net,irisInputs);
plotsomhits(net,irisInputs);
```
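What `plotsomhits` displays is, for each neuron, the number of input vectors that neuron wins. The same hit counts can be computed directly; a minimal NumPy sketch (the array sizes mirror a 5-by-5 map on the 4-element iris inputs, but the data here is random, not the iris set):

```python
import numpy as np

def som_hits(weights, X):
    """Count how many input samples each neuron wins (its 'hit' count),
    the quantity plotsomhits visualizes on the SOM grid."""
    dists = np.linalg.norm(X[:, None, :] - weights[None, :, :], axis=2)
    bmus = dists.argmin(axis=1)                # best-matching unit per sample
    return np.bincount(bmus, minlength=len(weights))

# Illustrative (untrained) 5x5 = 25 neurons with 4-element weight vectors;
# a trained map would give the structured hit pattern seen in the plot.
rng = np.random.default_rng(0)
weights = rng.random((25, 4))
X = rng.random((150, 4))                       # 150 samples, like the iris set
print(som_hits(weights, X).sum())              # every sample hits exactly one neuron: 150
```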

## Real World, Real Time Motivation

In 1991, NASA used a 3-input ANN to help diagnose catastrophic failures in the Space Shuttle's Main Engine (a 7,000 lb liquid hydrogen/oxygen rocket engine). The network was trained to detect failures during the first six seconds of an engine firing and to trigger a shutdown when required. Paper link: here.

Back to 2010 Spring ECE 662 mboutin
