06-12412 Introduction to Neural Computation
Syllabus Page 2003/2004
Level 4/M
Links | Outline | Aims | Outcomes | Prerequisites | Teaching | Assessment | Books | Detailed Syllabus
The Module Description is a strict subset of this Syllabus Page. (The University module description has not yet been checked against the School's.)
Changes and updates
Small change made to final Learning Outcome (29 Sep 2003)
Relevant Links
See the Introduction to Neural Computation web page for module material and further useful links.
Outline
Basic Neurobiology; Neural Networks; Single Neuron Models; Single Layer Perceptrons; Multi-Layer Perceptrons; Recurrent Networks; Radial-Basis Function Networks; Committee Machines; Kohonen Networks; Applications of Neural Networks.
Aims
The aims of this module are to:
- introduce some of the fundamental techniques and principles of neural computation
- investigate some common models and their applications
Learning Outcomes
| | On successful completion of this module, the student should be able to: | Assessed by: |
| 1 | understand the relation between real brains and simple artificial neural network models | Examination |
| 2 | describe and explain the most common architectures and learning algorithms for Multi-Layer Perceptrons, Recurrent Networks, Radial-Basis Function Networks, Committee Machines, and Kohonen Self-Organising Maps | Examination |
| 3 | explain the learning and generalisation aspects of neural computation | Examination, assignment |
| 4 | demonstrate an understanding of the implementational issues for common neural network systems | Examination, assignment |
| 5 | demonstrate an understanding of the practical considerations in applying neural computation to real classification and regression problems | Examination, assignment |
Restrictions, Prerequisites and Corequisites
Restrictions:
Excluded combination with 06-02360 (Introduction to Neural Networks).
Prerequisites:
None
Co-requisites:
None
Teaching
Teaching Methods:
2 hrs of lectures per week plus labs
Contact Hours:
Assessment
- Sessional: 2 hr open book examination (70%), continuous assessment (30%). Resit (where allowed) by examination only, with the continuous assessment mark carried forward.
- Supplementary (where allowed): as the sessional assessment.
Recommended Books
| Title | Author(s) | Publisher, Date |
| Neural Networks: A Comprehensive Foundation | S Haykin | Prentice Hall, 1999 |
| Neural Networks for Pattern Recognition | C M Bishop | Oxford University Press, 1995 |
| An Introduction to Neural Networks | K Gurney | Routledge, 1997 |
| An Introduction to the Theory of Neural Computation | J Hertz, A Krogh & R G Palmer | Addison Wesley, 1991 |
| Neural Computing: An Introduction | R Beale & T Jackson | IOP Publishing, 1990 |
Detailed Syllabus
- Introduction to Neural Networks and their History.
- Biological Neurons and Neural Networks, Artificial Neurons.
- Networks of Neurons, Single Layer Perceptrons (SLPs).
- Learning and Generalisation in SLPs.
- Hebbian Learning, Gradient Descent Learning.
- The XOR Problem, Linear Separability, Multi-Layer Perceptrons (MLPs) (see the perceptron sketch following this list).
- The Back-Propagation Learning Algorithm and its Variations.
- Other Optimisation Algorithms: Line Searches, Conjugate Gradient.
- Bias and Variance, Improving Generalisation.
- Constructive Algorithms, Pruning Algorithms.
- Applications of Multi-Layer Perceptrons.
- Recurrent Networks.
- Radial-Basis Function Networks.
- Applications of Radial-Basis Function Networks.
- Committee Machines.
- Kohonen Self-Organising Maps (SOMs).
- Learning Vector Quantisation (LVQ).
- Overview of More Advanced Topics.
- Summary and Review.
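As a concrete illustration of the early lecture topics above (single layer perceptrons, gradient descent learning, and the XOR/linear-separability issue), the following minimal Python/NumPy sketch trains a single sigmoid unit by batch gradient descent. The toy data, learning rate, and epoch count are illustrative assumptions and not part of the module material.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_slp(X, t, eta=1.0, epochs=10000, seed=0):
    """Train a single sigmoid unit (weights + bias) by batch gradient
    descent on the sum-of-squares error. Hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        y = sigmoid(X @ w + b)              # forward pass
        delta = (y - t) * y * (1.0 - y)     # dE/dz for sum-of-squares error
        w -= eta * (X.T @ delta)            # gradient descent step on weights
        b -= eta * delta.sum()              # gradient descent step on bias
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t_and = np.array([0, 0, 0, 1], dtype=float)   # linearly separable
t_xor = np.array([0, 1, 1, 0], dtype=float)   # not linearly separable

for name, t in [("AND", t_and), ("XOR", t_xor)]:
    w, b = train_slp(X, t)
    preds = (sigmoid(X @ w + b) > 0.5).astype(int)
    print(name, "targets:", t.astype(int), "predictions:", preds)

# The unit learns AND, but typically misclassifies at least one XOR pattern:
# a single-layer perceptron can only realise linearly separable functions,
# which motivates multi-layer perceptrons trained by back-propagation.
```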
Last updated: 29 Sep 2003
Source file: /internal/modules/COMSCI/2003/xml/12412.xml