Introduction to Neural Computation (Level 4/M)

Neural Computation (Level 3/H)

Dr John A. Bullinaria

j.a.bullinaria@cs.bham.ac.uk


This page is the main source of information for these modules. From it you will eventually be able to obtain all the slides, handouts, and exercise sheets used in the lectures, details about the continuous assessment and examination, and so on.


Module Outline

This module introduces the basic concepts and techniques of neural computation, and its relation to automated learning in computing machines more generally. It covers the main types of formal neuron and their relation to neurobiology, showing how to construct large neural networks and study their learning and generalization abilities in the context of practical applications. The Level 4/M version also provides practical experience of designing and implementing a neural network for a real world application.

Lecture Timetable and Handouts

Here's an outline of the module structure and lecture timetable. All the module handouts will be made available here as pdf files shortly before the paper versions are distributed in the lectures. Any spare paper copies will be deposited in the School library. I'll leave last year's notes in place, marked "[2013pdf]", until they are replaced by this year's notes, marked "[pdf]".

Week | Lecture 1 (Tuesdays 17:00-18:00) | Lecture 2 (Thursdays 17:00-18:00)
1 | Introduction to Neural Networks and their History [pdf] | Biological Neurons and Neural Networks, Artificial Neurons [pdf]
2 | Networks of Artificial Neurons, Single Layer Perceptrons [pdf] | Learning and Generalization in Single Layer Perceptrons [pdf]
3 | Hebbian Learning, Gradient Descent Learning [pdf] | The Generalized Delta Rule, Practical Considerations [pdf]
4 | Learning in Multi-Layer Perceptrons - Back-Propagation [pdf] | Learning with Momentum, Conjugate Gradient Learning [pdf]
5 | Bias and Variance - Under-Fitting and Over-Fitting [pdf] | Improving Generalization [pdf]
6 | Applications of Multi-Layer Perceptrons [pdf] | Exercise Session
7 | Recurrent Neural Networks [2013pdf] | Radial Basis Function Networks: Introduction [2013pdf]
8 | Radial Basis Function Networks: Algorithms [2013pdf] | Radial Basis Function Networks: Applications [2013pdf]
9 | Self Organizing Maps: Fundamentals [2013pdf] | Self Organizing Maps: Properties and Applications [2013pdf]
10 | Learning Vector Quantization [2013pdf] | Committee Machines [2013pdf]
11 | Model Selection and Evolutionary Optimization [2013pdf] | Exercise Session
12 | Revision Lecture Covering the Whole Module [2013pdf]

Aims, Learning Outcomes and Assessment

For formal details about the aims, learning outcomes and assessment you should look at the official Module Description (Level 4/M or Level 3/H) and Syllabus (Level 4/M or Level 3/H).

The Level 3 module Neural Computation is assessed by 100% Examination.

The Level 4 module Introduction to Neural Computation is assessed by 80% Examination and 20% Continuous Assessment.

In both cases the examination will be closed book, and you will be expected to answer all four questions, each worth 25% of the total.

A series of Exercise Sheets, largely based on recent examination questions, will give you an idea of the standard and type of questions to expect in this year's examination. These will be distributed once the associated material has been covered in the lectures. They do not contribute to the assessment for the module; they are designed to help you monitor your progress - try to answer the questions without your notes, and then use your notes to check whether your answers are correct. The Exercise Sessions will be used to talk through answers to any questions you have difficulty with. So far, the first three have been distributed: Exercise Sheet 1, Exercise Sheet 2 and Exercise Sheet 3.

Recommended Books and Links

The Recommended Books for this module are:

Title | Author(s) | Publisher, Date | Comments
Neural Networks and Learning Machines | Simon Haykin | Pearson, 2009 | Very comprehensive, but heavy in mathematics.
Neural Networks: A Comprehensive Foundation | Simon Haykin | Prentice Hall, 1999 | Older edition of the above book, but still covers the whole module.
Neural Networks for Pattern Recognition | Christopher Bishop | Clarendon Press, Oxford, 1995 | This is the book I always use, but it doesn't cover the whole module.
The Essence of Neural Networks | Robert Callan | Prentice Hall Europe, 1999 | Concise introductory text.
An Introduction to Neural Networks | Kevin Gurney | Routledge, 1997 | Non-mathematical introduction.
Fundamentals of Neural Networks | Laurene Fausett | Prentice Hall, 1994 | Good intermediate text.
Introduction to Neural Networks | R. Beale & T. Jackson | IOP Publishing, 1990 | Introductory text.
An Introduction to the Theory of Neural Computation | J. Hertz, A. Krogh & R.G. Palmer | Addison Wesley, 1991 | Good all round book. Slightly mathematical.
Principles of Neurocomputing for Science and Engineering | F. M. Ham & I. Kostanic | McGraw Hill, 2001 | Good advanced book, but rather mathematical.

If you can only afford to buy one book for this module, I would recommend getting either of the Haykin books.

If you want to find online information about Neural Networks, probably the best places to start are: The Neural Networks FAQ web-site, and the Neural Network Resources web-site, both of which are rather old now, but still contain a large range of information and links about all aspects of neural networks. This module was previously taught by Peter Tino, and you may find his 2007 module web-site useful.

When programming your own MLP neural networks, it may be useful to start with my Step by Step Guide to Implementing a Simple Neural Network in C. It should be fairly straightforward to see how to use it with related programming languages such as C++ and Java.


This page is maintained by John Bullinaria. Last updated on 30 October 2014.