Here's an outline of the module structure and lecture timetable. All the module handouts were made available here as pdf files shortly after the paper versions were distributed in the lectures.

Week | Session 1: Wednesdays 12:00-13:00 | Session 2: Thursdays 10:00-11:00 |
---|---|---|
1 | Introduction to Neural Networks and their History. [pdf] | Biological Neurons and Neural Networks. Artificial Neurons. [pdf] |
2 | Networks of Artificial Neurons. Single Layer Perceptrons. [pdf] | Learning and Generalization in Single Layer Perceptrons. [pdf] |
3 | Hebbian Learning. Gradient Descent Learning. [pdf] | The Generalized Delta Rule. Practical Considerations. [pdf] |
4 | Learning in Multi-Layer Perceptrons. Back-Propagation. [pdf] | Learning with Momentum. Conjugate Gradient Learning. [pdf] |
5 | Bias and Variance. Under-Fitting and Over-Fitting. [pdf] | Improving Generalization. [pdf] |
6 | Applications of Multi-Layer Perceptrons. [pdf] | Exercise Session 1 |
7 | Radial Basis Function Networks: Introduction. [pdf] | Radial Basis Function Networks: Algorithms. [pdf] |
8 | Radial Basis Function Networks: Applications. [pdf] | Committee Machines. [pdf] |
9 | Exercise Session 2 | Self Organizing Maps: Fundamentals. [pdf] |
10 | Self Organizing Maps: Algorithms and Applications. [pdf] | Learning Vector Quantization (LVQ). [pdf] |
11 | Overview of More Advanced Topics. [pdf] | Exercise Session 3 |
12 | Two Revision Lectures Covering the Whole Module. [pdf] | |

For formal details about the aims, learning outcomes and assessment, see the official Module Description Page and Syllabus Page.

There are two components to the assessment of this module: a two-hour examination (70%) and continuous assessment by mini-project report (30%). (NB The module description says "resit by written examination only with the continuous assessment mark carried forward", so resit students DO NOT do this year's continuous assessment mini-project!)

A series of exercise sheets, largely based on recent past examination questions, will give an idea of the standard and type of questions you can expect in this year's examination. These will be distributed when the associated material has been covered in the lectures. They do not contribute to the assessment for the module. The Exercise Sessions will be used to talk through appropriate answers to the questions on the Exercise Sheets. They have now all been distributed: Exercise Sheet 1, Exercise Sheet 2, Exercise Sheet 3, Exercise Sheet 4 and Exercise Sheet 5.

I have also produced an Equation Sheet which contains the main equations that you should be familiar with from the module.

The Continuous Assessment Assignment was distributed and discussed in Exercise Session 1.

The objective of this exercise is for you to gain practical experience in setting up, training and optimising a neural network designed to recover the underlying function from a set of noisy training data.

The data sets for the project are generated individually for each student: download the data generation program called datagen from here (usually via the right mouse button). It is compiled to run on the School's Linux PCs. Enter 'chmod 700 datagen' at the Linux prompt to make it executable, and then run it to generate your data.

You may use any software you wish to run the specified neural network simulations, but we have
installed the *javaNNS* simulator on the School's system for use on the Linux PCs.
If you have not used this software before, you may want to begin by taking a look at my
Getting Started with javaNNS page,
and then at my Quick Guide to javaNNS.
Beyond that, there is a fairly comprehensive on-line help system to guide you.

This package is also good for exploring the other aspects of neural networks that are discussed in this module.

For those of you who might be interested in programming your own neural networks, rather than using *javaNNS*,
I've written a web page giving a Step by Step Guide
to Implementing a Simple Neural Network in C which will get you started. It should be fairly
straightforward to see how to use it with related programming languages such as C++ and Java.

The Recommended Books for this module are:

Title | Author(s) | Publisher, Date | Comments |
---|---|---|---|
An Introduction to Neural Networks | Kevin Gurney | Routledge, 1997 | Non-mathematical introduction. |
Neural Networks: A Comprehensive Foundation | Simon Haykin | Prentice Hall, 1999 | Very comprehensive and up-to-date, but heavy in maths. |
Neural Networks for Pattern Recognition | Christopher Bishop | Clarendon Press, Oxford, 1995 | This is the book I always use. |
Fundamentals of Neural Networks | Laurene Fausett | Prentice Hall, 1994 | Good intermediate text. |
The Essence of Neural Networks | Robert Callan | Prentice Hall Europe, 1999 | Worth reading. |
Introduction to Neural Networks | R. Beale & T. Jackson | IOP Publishing, 1990 | Introductory text. |
An Introduction to the Theory of Neural Computation | J. Hertz, A. Krogh & R.G. Palmer | Addison Wesley, 1991 | Good all round book. Slightly mathematical. |
Principles of Neurocomputing for Science and Engineering | F. M. Ham & I. Kostanic | McGraw Hill, 2001 | Good advanced book, but rather mathematical. |

If you can only afford to buy one book for this module, I would recommend getting the one by Haykin if you have a reasonably mathematical background, or the one by Gurney if you don't.

If you want to find online information about neural networks, probably the best places to start are the Neural Networks FAQ website and the Neural Network Resources website, both of which contain a wide range of information and links about all aspects of neural networks.