Learning in Multilayer Networks
This page contains a set of exercises that you can carry out to
improve your understanding of multilayer networks. You will need the
SNNS neural network simulator to do them. There are some tutorials on
SNNS; you should work through those first.
These exercises will help your understanding in several ways. In
particular, they will help you develop a sense of how multilayer
networks behave, how they represent functions, and what they can and
cannot do. They should help you develop a deep understanding of how
networks operate at the microscopic level.
You should have read the SNNS tutorials before attempting these exercises.
Use SNNS to construct a single-layer network of
sigmoid units suitable for a problem with 2 inputs and 1
output. Train the network with the delta rule on
a linearly separable classification problem. Some example data is
given in /bham/ums/common/pd/packages/snns/local/exercises/Q1.pat.
Analyse and discuss the
network's learning behaviour on the basis of your observations.
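If you want to check your intuition outside SNNS, the following is a
minimal Python sketch of the delta rule for a single sigmoid unit. The
toy AND data set stands in for Q1.pat (whose exact contents are not
reproduced here), and the learning rate and epoch count are
illustrative choices:

    import math
    import random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # Toy linearly separable data set (a stand-in for Q1.pat): AND.
    data = [((0.0, 0.0), 0.0), ((0.0, 1.0), 0.0),
            ((1.0, 0.0), 0.0), ((1.0, 1.0), 1.0)]

    random.seed(0)
    w = [random.uniform(-0.5, 0.5) for _ in range(2)]
    b = random.uniform(-0.5, 0.5)
    eta = 0.5  # learning rate

    for epoch in range(2000):
        total_error = 0.0
        for (x1, x2), target in data:
            y = sigmoid(w[0] * x1 + w[1] * x2 + b)
            # Delta rule: step each weight along the error gradient.
            delta = (target - y) * y * (1.0 - y)
            w[0] += eta * delta * x1
            w[1] += eta * delta * x2
            b += eta * delta
            total_error += (target - y) ** 2
        if epoch % 500 == 0:
            print(epoch, round(total_error, 4))

The falling per-epoch error printed here is the analogue of the error
curve you can display in SNNS.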
Now try applying your network with the delta rule to the XOR
problem. What behaviour does the network exhibit? How do the error and
weights change through time?
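One way to see what to expect: a single sigmoid unit outputs more than
0.5 exactly when its net input is positive, so representing XOR with
weights w1, w2 and bias b would require

    b < 0               (pattern 00 -> output 0)
    w1 + b > 0          (pattern 10 -> output 1)
    w2 + b > 0          (pattern 01 -> output 1)
    w1 + w2 + b < 0     (pattern 11 -> output 0)

Adding the middle two inequalities gives w1 + w2 + 2b > 0; since b < 0
this implies w1 + w2 + b > 0, contradicting the last line. No weight
setting satisfies all four, so a single-layer network cannot represent
XOR and the delta rule cannot drive the error to zero.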
Construct a layered feedforward network with two hidden
units and one output unit. Apply the standard
backpropagation algorithm (the generalised delta rule) to solve the
XOR problem. Try increasing the number of hidden units, and try
changing the learning rate. How do these changes affect the
convergence time? Explain why the weights get larger and larger, and
suggest how you might prevent this.
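For reference, the following is a minimal Python sketch of the
generalised delta rule on a 2-2-1 network, assuming online
(pattern-by-pattern) updates. The weight-decay term is one
illustrative way of bounding weight growth, not part of the standard
algorithm; depending on the seed, the network may land in a local
minimum, in which case restart with different initial weights:

    import math
    import random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # The XOR training set.
    data = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
            ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

    random.seed(1)
    # 2 inputs -> 2 hidden units -> 1 output unit, all with biases.
    w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    b_h = [random.uniform(-1, 1) for _ in range(2)]
    w_o = [random.uniform(-1, 1) for _ in range(2)]
    b_o = random.uniform(-1, 1)

    eta = 0.5       # learning rate
    decay = 0.0001  # weight decay; set to 0.0 to watch the weights grow

    for epoch in range(20000):
        total_error = 0.0
        for x, target in data:
            # Forward pass.
            h = [sigmoid(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j])
                 for j in range(2)]
            y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
            # Backward pass (generalised delta rule).
            delta_o = (target - y) * y * (1.0 - y)
            delta_h = [delta_o * w_o[j] * h[j] * (1.0 - h[j])
                       for j in range(2)]
            # Updates; the decay term shrinks each weight slightly per step.
            for j in range(2):
                w_o[j] += eta * delta_o * h[j] - decay * w_o[j]
                for i in range(2):
                    w_h[j][i] += eta * delta_h[j] * x[i] - decay * w_h[j][i]
                b_h[j] += eta * delta_h[j]
            b_o += eta * delta_o
            total_error += (target - y) ** 2
        if epoch % 5000 == 0:
            print(epoch, round(total_error, 4))

With decay set to 0.0 you can watch the weights grow without bound:
pushing the sigmoid outputs towards exactly 0 and 1 requires
ever-larger net inputs, which only ever-larger weights can supply.
Weight decay is one standard remedy; stopping training once the error
is acceptably small is another.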
The aim of this exercise is to investigate whether the parity problem
can be solved using a multilayer network. Set up a network
with 5 input units, 5 hidden units and 1 output unit. The training set
for the 5-bit problem is available as /bham/ums/common/pd/packages/snns/local/exercises/parity.pat.
The parity problem is
simply to determine whether an even or odd number of the inputs are
on. If an even number are on, the network should output a zero; if an
odd number, a one. Show that your network can
learn this. After the network has learned, analyse the weights and
explain how the trained network encodes the function. Do you have more
success if you increase the number of hidden units?
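If you cannot read parity.pat directly, the full 5-bit training set is
small enough to generate yourself. A short sketch (this prints plain
input/target pairs and does not reproduce the SNNS .pat file format):

    from itertools import product

    # Enumerate all 32 input patterns of the 5-bit parity problem.
    # The target is 1 when an odd number of inputs are on, 0 when even.
    for bits in product((0, 1), repeat=5):
        target = sum(bits) % 2
        print(" ".join(str(b) for b in bits), "->", target)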