3rd/4th Year UG and MSc

(Introduction to) Neural Computation

Course Material and Useful Links

Peter Tino

P.Tino@cs.bham.ac.uk




Lecture Timetable and Handouts

Here is a preliminary outline of the module structure and lecture timetable. I will develop most of the ideas on the blackboard. You are encouraged to take notes during the lectures. Any handouts used will be made available here as pdf files shortly after the paper versions have been distributed. Some knowledge of maths is assumed, but more complicated notions will always be explained during the lectures. However, your maths should be better than this.

Session 1: Mondays 9:00-10:00. Session 2: Fridays 13:00-14:00.

Week 1
  Session 1: Module Administration. Introduction to Neural Networks and their History.
  Session 2: Biological Neurons and Neural Networks. Artificial Neurons. [PDF]
Week 2
  Session 1: Networks of Artificial Neurons. Single Layer Perceptrons.
  Session 2: Learning and Generalization in Single Layer Perceptrons.
Week 3
  Session 1: Perceptron: Training and convergence. [DEMO]
  Session 2: Multi-Layer Perceptrons.
Week 4
  Session 1: Feed-forward networks as a function class.
  Session 2: Gradient-based optimization. [PDF]
Week 5
  Session 1: Training by Back-Propagation of errors.
  Session 2: Variants and modifications of BackProp.
Week 6
  Session 1: Self-Organising Maps. [PDF] (John Bullinaria)
  Session 2: Learning Vector Quantisation. [PDF] (John Bullinaria)
Week 7
  Session 1: Neural network applications. [PDF] [PDF]
  Session 2: Statistical perspective on learning.
Week 8
  Session 1: Bias-variance dilemma. [PDF]
  Session 2: Dealing with the bias-variance dilemma.
Week 9
  Session 1: Ensembles of Neural Networks. [PDF]
  Session 2: Negative correlation in ensembles of neural networks. [PDF]
Week 10
  Session 1: Radial basis function networks.
  Session 2: Generalized Radial basis function networks, kernel machines.
Week 11
  Session 1: Recurrent neural networks.
  Session 2: Back-Propagation through time, Real-Time recurrent learning.
Week 12
  Two Revision Lectures Covering the Whole Module.




Suggested reading




Try the models out! - Benchmark data sets

As you get familiar with the different types of neural network models, try them out on benchmark data sets that people in the machine learning community have been using to support their claims about yet another excellent learning system :-)

Here are two widely used data repositories that contain data descriptions, the data itself, and other useful material, such as previously obtained results.

  • DELVE - Data for Evaluating Learning in Valid Experiments
  • UCI Knowledge Discovery in Databases Archive




    Linear models on autoassociative tasks

    Here is a simple example of the Generalized Inverse (GI) and Correlation Matrix Memory (CMM) models.

    Take a look at the 8 faces (taken from CMU image data repository within the UCI archive) forming the training data. The last face (lower right corner) is a corrupt version of the first face (upper left corner).

    The recall by the GI and CMM models, as well as novelty detection by GI, can be found here.
    GI is the more complicated model, but it gives better results.

    What about the new input pattern at the upper left corner of this figure? Nothing like it is present in the training corpus, so both models get confused. Note the multiple "ghost" faces in the recall of GI. Novelty detection, however, clearly detects what is wrong with this input.

    Data and MATLAB code for producing these pictures are collected in the gzipped tar file. Run gi_cmm_faces.m.
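    To see the storage mechanism with no image machinery in the way, here is a toy version of the CMM idea on made-up bipolar patterns, written in C (the faces demo itself is in MATLAB, so this is an illustration, not the demo code). The GI model would replace the Hebbian outer-product sum below with W = X * pinv(X), a projection onto the span of the stored patterns, which is what suppresses the ghost faces:

      /* Toy correlation matrix memory (CMM) on bipolar patterns.
       * Illustration only; the patterns are made up. */
      #include <stdio.h>

      #define N 8   /* pattern dimension */
      #define P 2   /* number of stored patterns */

      int main(void) {
          int x[P][N] = {                    /* training patterns */
              { 1, -1,  1,  1, -1, -1,  1, -1},
              {-1, -1,  1, -1,  1,  1, -1,  1}
          };
          double W[N][N] = {{0.0}};

          /* storage: Hebbian outer-product sum  W = sum_p x_p x_p^T */
          for (int p = 0; p < P; p++)
              for (int i = 0; i < N; i++)
                  for (int j = 0; j < N; j++)
                      W[i][j] += x[p][i] * x[p][j];

          /* recall from a corrupt copy of pattern 0 (bits 0 and 7
           * flipped); thresholding W*probe recovers the stored pattern */
          int probe[N] = {-1, -1, 1, 1, -1, -1, 1, 1};
          for (int i = 0; i < N; i++) {
              double s = 0.0;
              for (int j = 0; j < N; j++) s += W[i][j] * probe[j];
              printf("%2d ", s >= 0.0 ? 1 : -1);
          }
          printf("\n");
          return 0;
      }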




    Assignment for those taking "Introduction to Neural Computation" (extended deadline: Thursday 24th Jan 2008, noon)

    First, implement a feed-forward neural network trained by BackPropagation. Alternatively, familiarize yourself with my implementation in C. Un-tar the file bp.code.tar.gz, go to the folder "BP.CODE" and consult "read.me". There is an example data set in the directory "IONOSPHERE". Preprocessed training inputs and desired outputs (targets) are in the files "ionosphere.trn.in" and "ionosphere.trn.nn.t", respectively. Test inputs and targets are in "ionosphere.tst.in" and "ionosphere.tst.nn.t", respectively. Read more about the data set in the "ionosphere.info.txt" file. You can also use one of the many neural network simulators available on the web; see the section Simulators and Code below.
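    To give a feel for what a from-scratch implementation involves, here is a minimal sketch of a one-hidden-layer network trained by on-line BackPropagation on the XOR problem, written in plain C. It is a sketch only: the architecture, learning rate and epoch count are illustrative choices, and this is not the bp.code implementation.

      /* Minimal one-hidden-layer network trained by on-line
       * BackPropagation on XOR. Compile with: cc bp_xor.c -lm */
      #include <stdio.h>
      #include <stdlib.h>
      #include <math.h>

      #define NIN  2
      #define NHID 3
      #define NPAT 4

      static double sigmoid(double a) { return 1.0 / (1.0 + exp(-a)); }
      static double rnd(void) { return (double)rand() / RAND_MAX - 0.5; }

      int main(void) {
          double in[NPAT][NIN] = {{0,0},{0,1},{1,0},{1,1}};
          double tgt[NPAT]     = {0, 1, 1, 0};
          double wh[NHID][NIN + 1];          /* hidden weights, last = bias */
          double wo[NHID + 1];               /* output weights, last = bias */
          double eta = 0.5;                  /* learning rate */

          for (int h = 0; h < NHID; h++) {   /* small random initial weights */
              wo[h] = rnd();
              for (int i = 0; i <= NIN; i++) wh[h][i] = rnd();
          }
          wo[NHID] = rnd();

          for (int epoch = 0; epoch < 10000; epoch++) {
              double sse = 0.0;
              for (int p = 0; p < NPAT; p++) {
                  /* forward pass */
                  double hid[NHID], a = wo[NHID];
                  for (int h = 0; h < NHID; h++) {
                      double s = wh[h][NIN];
                      for (int i = 0; i < NIN; i++) s += wh[h][i] * in[p][i];
                      hid[h] = sigmoid(s);
                      a += wo[h] * hid[h];
                  }
                  double out  = sigmoid(a);
                  double dout = (tgt[p] - out) * out * (1.0 - out);
                  sse += (tgt[p] - out) * (tgt[p] - out);

                  /* backward pass: deltas, then gradient-descent updates */
                  for (int h = 0; h < NHID; h++) {
                      double dhid = dout * wo[h] * hid[h] * (1.0 - hid[h]);
                      wo[h] += eta * dout * hid[h];
                      for (int i = 0; i < NIN; i++)
                          wh[h][i] += eta * dhid * in[p][i];
                      wh[h][NIN] += eta * dhid;
                  }
                  wo[NHID] += eta * dout;
              }
              if (epoch % 2000 == 0)
                  printf("epoch %5d  sse %.4f\n", epoch, sse);
          }
          return 0;
      }

    For batch learning you would accumulate the weight changes over all patterns and apply them once per epoch; for tanh units the targets become -1/+1 and the derivative terms change accordingly (two of the points listed below).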

    You can work with the ionosphere data, or choose one of the three data sets collected in data.tar.gz. Read the information files (*.info.txt) in the "DATA/IRIS", "DATA/WINE" and "DATA/TIC-TAC-TOE" folders. Alternatively, you can pick any reasonably complex data set of your own choosing (e.g. from the web) or your own making. Please consult me if you decide to go for the last option.

    Train and test the networks with

  • batch and on-line learning (worth 10%)
  • different numbers of hidden units (worth 10%)
  • different activation functions, e.g. logistic sigmoid, tanh function, etc. (don't forget to change the targets accordingly!) (worth 10%)
  • different values of learning and momentum rates (worth 10%)
  • at least one strategy for varying the learning rate during the course of learning. It can be any strategy you find in the literature, or your own strategy (highly encouraged), provided you can justify it (at least in intuitive terms). Two example schedules are sketched after this list. (worth 20%)
  • Report the results in a statistically organized manner, i.e. compute simple statistics of the performance measures (mean and standard deviation) across multiple runs of network training/testing. (worth 20%)
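    Here are the two example learning-rate schedules promised above, sketched in C. Both are offered purely as illustrations (the constants are arbitrary, not recommendations); any strategy you can justify is acceptable:

      #include <stdio.h>

      /* "search then converge" decay: eta(t) = eta0 / (1 + t/tau) */
      double eta_decay(double eta0, double tau, int epoch) {
          return eta0 / (1.0 + epoch / tau);
      }

      /* "bold driver": grow eta slowly while the training error keeps
       * falling; cut it sharply (and ideally undo the last weight
       * update) as soon as the error rises */
      double eta_bold_driver(double eta, double err, double prev_err) {
          return (err < prev_err) ? eta * 1.05 : eta * 0.5;
      }

      int main(void) {
          for (int t = 0; t <= 100; t += 25)
              printf("epoch %3d: eta = %.4f\n", t, eta_decay(0.5, 50.0, t));
          return 0;
      }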

    Estimate the test error for some of the networks on the original training set using the n-fold cross-validation strategy. Compare with the test error computed on the original test set. Repeat the experiment for n = 5, 10, and N, where N is the size of the original training set (leave-one-out cross-validation). (worth 20%)
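    The bookkeeping for the n-fold experiment can be as simple as the following sketch. The train_network and test_error routines are placeholder stubs standing in for your own code, and the patterns should be shuffled first if the data file is ordered by class. Setting n equal to the number of training patterns gives leave-one-out:

      #include <stdio.h>

      #define NPATTERNS 20   /* use the size of your training set here */

      /* Placeholder stubs: replace with your own routines. */
      static void train_network(const int *train_idx, int ntrain) {
          (void)train_idx; (void)ntrain;    /* train from scratch here */
      }
      static double test_error(const int *test_idx, int ntest) {
          (void)test_idx; (void)ntest;
          return 0.0;                       /* error on the held-out fold */
      }

      /* n-fold cross-validation: pattern p goes to fold p % n; fold k
       * is held out in run k and the n test errors are averaged. */
      static double cross_validate(int n) {
          int train_idx[NPATTERNS], test_idx[NPATTERNS];
          double total = 0.0;
          for (int k = 0; k < n; k++) {
              int ntrain = 0, ntest = 0;
              for (int p = 0; p < NPATTERNS; p++) {
                  if (p % n == k) test_idx[ntest++] = p;
                  else            train_idx[ntrain++] = p;
              }
              train_network(train_idx, ntrain);
              total += test_error(test_idx, ntest);
          }
          return total / n;
      }

      int main(void) {
          printf("5-fold CV error estimate: %g\n", cross_validate(5));
          return 0;
      }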

    The overall mark m (in the range 0-100%) will be linearly scaled to the range 0-20% by 0.2*m.

    Example reports from the past:
    - report by Kennon Ballou
    - report by Andrew Brown
    - report by Michael Nashvili
    Kennon, Andrew, Michael - Thank you so much!




    Preparing for the exam - Sample Questions




    Aims, Objectives, and Assessment

    For formal details about the aims, objectives, and assessment you should look at the official Module Syllabus Page [Neural Comp] [Intro to Neural Comp].

    Neural Comp. students will be assessed by examination (100%).
    Intro to Neural Comp. students will be assessed based on examination (80%) and a continuous assessment by mini-project report (20%).

    As the material is developed I will give you an idea of the standard and type of questions you can expect in this year's examination. I will address questions related to the material covered in previous lectures in great detail during the timetabled Exercise Sessions.




    Recommended Books and Links

    The Recommended Books for this module are:

    Title | Author(s) | Publisher, Date | Comments
    An Introduction to Neural Networks | Kevin Gurney | Routledge, 1997 | Non-mathematical introduction.
    Neural Networks: A Comprehensive Foundation | Simon Haykin | Prentice Hall, 1999 | Very comprehensive, a bit heavy in maths.
    Neural Networks for Pattern Recognition | Christopher Bishop | Clarendon Press, Oxford, 1995 | Highly recommended for mathematically minded students.
    Introduction to Neural Networks | R. Beale & T. Jackson | IOP Publishing, 1990 | Introductory text.
    An Introduction to the Theory of Neural Computation | J. Hertz, A. Krogh & R.G. Palmer | Addison Wesley, 1991 | Good all-round book. Slightly mathematical.
    Neural Networks - Notes | I.F. Wilde, King's College London | Publicly available at http://www.mth.kcl.ac.uk/~iwilde/notes/nn/ [PDF] | An excellent set of notes. Highly recommended for mathematically minded students.

    If you have a reasonably mathematical background and you can only afford to buy one book for this module, I would recommend getting the one by Haykin.

    If you want to find information about Neural Networks on the web, the best place to start is the Neural Networks FAQ site, which contains a large range of information and links about all aspects of neural networks.

    Comprehensive information about resources and activities related to Neural Networks can be found at Neural Network Resources.

    When programming your own neural networks, it may be useful to consult the web page created by John Bullinaria, which gives a Step by Step Guide to Implementing a Simple Neural Network in C. It should be fairly straightforward to see how to adapt the code to related programming languages such as C++ and Java.




    Simulators and Code

    • Web Sim - a Java neural network simulator.
    • Brainwave - a Java-based simulator.
    • tlearn - Windows, Macintosh and Unix implementation of backprop and variants. Written in C.
    • PDP++ - C++ software with every conceivable bell and whistle. Unix only. The manual also makes a good tutorial.


    This page is maintained by Peter Tino. Last updated on 9 May 2007.