# Notes

This page will contain the lecture slides as they are needed (usually 24 hours before each lecture), plus links to any supplementary material that may be useful as a starting point for further enquiry. It will also list the preparatory reading prior to each lecture and include information on the PeerWise work related to it. The video from each lecture will be available as an online video, or as a download. You can also subscribe to the IAI channel on Vimeo. If you have any problems accessing any of the media, please let me know.

You are expected to have read the material listed from Artificial Intelligence: A Modern Approach (AI:AMA) before the subsequent lecture. The PeerWise questions are set to check whether you've done the reading, and to help you to start understanding it. The lectures should help you to understand the material if you have trouble getting to grips with it. Anything that is covered in the reading will be considered a legitimate topic for the exercises and the exam.

The reading is taken from the AI:AMA 3rd edition, but there should be equivalent content in earlier editions. I chose to use this book because it provides an excellent, deep coverage of most of AI, and will therefore be useful throughout your degree. If you do not wish to buy the book there should be copies in the school and university libraries.

Content and supplementary material
Lecture 20 27/11/12
Linear Classification
vimeo

Past exam questions

2012: Q6

In this lecture we look at the problem of fitting a straight line to some training data in order to use it to separate the data into two classes, and thus predict which class a future data point falls into.
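A minimal sketch of the idea in Python, using the perceptron update rule as one simple way to learn such a separating line. The data points, learning rate and number of epochs are invented for illustration, not taken from the lecture:

```python
# Perceptron sketch: learn weights w and bias b so that
# sign(w.x + b) separates two classes of 2D points.

def perceptron(points, labels, epochs=100, lr=0.1):
    """Fit a linear separator; labels are +1 or -1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
            if pred != y:  # misclassified: nudge the line towards this point
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def classify(w, b, point):
    """Predict the class of a new point from which side of the line it is on."""
    return 1 if w[0] * point[0] + w[1] * point[1] + b > 0 else -1

# Two linearly separable clusters (hypothetical data)
points = [(1, 1), (1, 2), (2, 1), (5, 5), (6, 5), (5, 6)]
labels = [-1, -1, -1, 1, 1, 1]
w, b = perceptron(points, labels)
print(classify(w, b, (0, 0)), classify(w, b, (7, 7)))  # one point from each side
```

For linearly separable data like this, the perceptron rule is guaranteed to converge; the section 18.6 reading covers this and the gradient-based alternatives.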

This tag covers reading from Section 18.6 - Regression and Classification with Linear Models.

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above. This is the last PeerWise assignment for the module.

Set: 21/11/12

AI:A Modern Approach

Section 18.6 - Regression and Classification with Linear Models

Section 18.4 - Evaluating and Choosing the Best Hypothesis

Section 18.5 - The Theory of Learning

Try not to worry too much about the maths in general and the derivations in particular. For this module we will just use the formulae without trying to understand how to obtain them. You shouldn't need to understand them to do the online questions or assessed exercise.

Lectures 18 and 19 21-27/11/12
Univariate Linear Regression

Linear Regression 1

vimeo

Linear Regression 2

vimeo

In this lecture we look at the problem of fitting a straight line to some training data in order to use it to predict future data points.
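The line fitting described above can be sketched with the standard least-squares formulas from the section 18.6 reading. The data points below are invented for illustration:

```python
# Univariate linear regression: fit y = w0 + w1 * x by minimising squared loss.

def fit_line(xs, ys):
    """Return (w0, w1) for the least-squares line through the data."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Closed-form solution: slope from covariance over variance
    w1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
         sum((x - x_bar) ** 2 for x in xs)
    w0 = y_bar - w1 * x_bar
    return w0, w1

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]          # exactly y = 1 + 2x
w0, w1 = fit_line(xs, ys)
print(w0, w1)              # → 1.0 2.0
```

A prediction for a future data point is then simply `w0 + w1 * x`.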

This test covers reading from Section 18.6 - Regression and Classification with Linear Models.

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Lectures 15, 16 and 17 15/11/12 - 20/11/12
Learning Decision Trees (also slides with transitions)

Learning Decision Trees

vimeo

Entropy and Information Gain

vimeo

Entropy Remainder

vimeo

Past exam questions

2010: Q4a

2011: Q1

2012: Q5

In these lectures we'll look at the decision tree representation, and the ID3 algorithm for learning such trees from training data. Part of this will include looking at information gain (entropy reduction) as a mechanism for selecting an attribute test when learning a decision tree from data.
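The entropy and information-gain calculations at the heart of attribute selection can be sketched as follows. The example counts are invented for illustration:

```python
import math

def entropy(pos, neg):
    """Entropy (in bits) of a boolean distribution with pos/neg examples."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count:
            p = count / total
            h -= p * math.log2(p)
    return h

def information_gain(pos, neg, splits):
    """Gain = entropy before the test minus the weighted entropy after.
    `splits` lists the (pos, neg) counts for each value of the attribute."""
    total = pos + neg
    remainder = sum((p + n) / total * entropy(p, n) for p, n in splits)
    return entropy(pos, neg) - remainder

# Hypothetical data set: 6 positive and 6 negative examples overall.
# A perfectly informative attribute splits them cleanly:
print(information_gain(6, 6, [(6, 0), (0, 6)]))  # → 1.0
# An uninformative attribute leaves each branch at 50/50:
print(information_gain(6, 6, [(3, 3), (3, 3)]))  # → 0.0
```

ID3 picks the attribute with the highest gain at each node, then recurses on the resulting subsets.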

You can try the examples from the lecture on AISpace, using the Decision Trees tool. You can use the following links to get xml files for the problems: the film data; and the lectures data. Load them into the tool using the "Load from File" option under the "File" menu.

This test covers reading from Section 18.3 - Learning Decision Trees.

Set: 9/11/12

AI:A Modern Approach

Section 18.3 - Learning Decision Trees

Background Reading (not assessed, covered in part by Lecture 14)

Section 18.0 - Learning from Examples

Section 18.1 - Forms of Learning

Section 18.2 - Supervised Learning

Section 18.4 - Evaluating and Choosing the Best Hypothesis

Section 18.5 - The Theory of Learning

Try not to worry too much about the maths in general and the entropy calculations in particular. We will go through these in a lot more detail in the lecture. You shouldn't need to understand them to do the online test.

Lecture 14 6/11/12
Introduction to Machine Learning
No video due to power cut. See last year.

In this lecture we look at the basic approaches to machine learning: supervised learning, unsupervised learning and reinforcement learning. We also look at the idea of training and test sets.
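The training/test set idea can be sketched very simply: hold back some labelled examples so you can estimate how well the learned model generalises to data it has not seen. The data and split ratio below are invented for illustration:

```python
import random

examples = list(range(20))          # stand-ins for labelled examples
random.seed(0)                      # fixed seed so the split is repeatable
random.shuffle(examples)            # shuffle so the split is unbiased

split = int(0.8 * len(examples))
training_set = examples[:split]     # used to fit the model
test_set = examples[split:]         # held out to measure generalisation

print(len(training_set), len(test_set))  # → 16 4
```

Evaluating only on the training set would reward memorisation; performance on the held-out test set is what matters.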

Lecture 13 5/11/12
Markov Chains

In this lecture we look at Markov Chains as a way of reasoning about probabilistic variables over time.
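A minimal sketch of the idea: push a probability distribution over states forward one time step at a time using the transition model P(X_t | X_{t-1}). The two-state weather model below is a hypothetical example, not from the lecture:

```python
# transition[i][j] = P(next state = j | current state = i)
# States: 0 = "rain", 1 = "no rain" (invented weather model)
transition = [[0.7, 0.3],
              [0.3, 0.7]]

def step(dist):
    """One time step: new_j = sum_i dist_i * P(j | i)."""
    return [sum(dist[i] * transition[i][j] for i in range(2))
            for j in range(2)]

dist = [1.0, 0.0]          # certain rain at t = 0
for t in range(1, 4):
    dist = step(dist)
    print(t, dist)
# The distribution drifts towards the stationary distribution [0.5, 0.5]:
# our knowledge of the initial state fades as time passes.
```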

This test covers material from AI:A Modern Approach sections 13.5, 14.1 and 15.1.

The purpose of this assessment is to test whether you have read the set material, not whether you really understand it yet. Therefore, if you've struggled with the reading, please attempt as much of this test as possible and then attend the lectures to get help understanding the material.

Set: 31/10/12

AI:A Modern Approach

Section 13.5: Bayes' Rule and Its Use

Section 14.1: Representing Knowledge in an Uncertain Domain

Section 15.1: Time and Uncertainty

Advanced Reading. This will not be assessed, but provides further knowledge on the topics we will cover for those interested in stretching themselves.

Section 14.2: The Semantics of Bayesian Networks

Section 15.2: Inference in Temporal Models

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Lectures 11 and 12 30-31/10/12
Bayes' Rule and Probabilistic Graphical Models

Bayes' Rule and Conditional Independence

vimeo

Bayesian Networks

vimeo

Past exam questions

2005: Page 11

2006: Page 12

2010: Q1

2011: Q5

2012: Q3 and Q8b

In these lectures we'll look at Bayes' rule, particularly in the context of diagnosis, before moving on to Bayesian networks (a type of probabilistic graphical model).
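The diagnostic use of Bayes' rule can be sketched with a hypothetical rare disease and an imperfect test (the numbers are invented for illustration):

```python
prior = 0.01            # P(disease)
sensitivity = 0.9       # P(positive | disease)
false_positive = 0.05   # P(positive | no disease)

# Bayes' rule: P(disease | positive) = P(positive | disease) P(disease) / P(positive),
# where P(positive) is found by summing over both causes.
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # ≈ 0.154
```

Note the classic result: even after a positive test, the disease is still unlikely, because the prior is so low. This is exactly the kind of reasoning Bayesian networks generalise to many variables.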

You can try the example from the lecture on AISpace, using the Belief and Decision Networks tool. Here is the link for the xml file for the Matilda meal problem Bayes net. Load it into the tool using the "Load from File" option under the "File" menu.

In this Undercover Economist article Tim Harford discusses the importance of Bayesian reasoning in an informal setting. For IT applications of Bayes nets see this LA Times article and this Economist article.

In this scientific article, Judea Pearl, one of the founders of the probabilistic AI movement, discusses the strengths and weaknesses of Bayesian methods in AI. You can also watch his 2011 ACM Turing Award Lecture.

This test covers material from AI:A Modern Approach sections 13.5, 14.1 and 15.1.

The purpose of this assessment is to test whether you have read the set material, not whether you really understand it yet. Therefore, if you've struggled with the reading, please attempt as much of this test as possible and then attend the lectures to get help understanding the material.

Set: 24/10/12

AI:A Modern Approach

Section 13.5: Bayes' Rule and Its Use

Section 14.1: Representing Knowledge in an Uncertain Domain

Section 15.1: Time and Uncertainty

Advanced Reading. This will not be assessed, but provides further knowledge on the topics we will cover for those interested in stretching themselves.

Section 14.2: The Semantics of Bayesian Networks

Section 15.2: Inference in Temporal Models

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Lectures 9 and 10 23-24/10/12
Probabilistic Inference

Probabilistic Inference 1

vimeo

Probabilistic Inference 2

vimeo

Probabilistic notation: handout

Past exam questions

2012: Q2 and Q8a

In these lectures we'll look at the fundamental concepts necessary for probabilistic inference.
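The core operations (marginalisation and conditioning on a full joint distribution, as in the section 13.3 reading) can be sketched as follows. The joint probability values are invented for illustration:

```python
# Full joint distribution over two boolean variables: P(Toothache, Cavity)
joint = {
    (True, True): 0.04, (True, False): 0.06,
    (False, True): 0.01, (False, False): 0.89,
}

# Marginalisation: P(Cavity=true) = sum over the other variable
p_cavity = sum(p for (tooth, cavity), p in joint.items() if cavity)

# Conditioning: P(Cavity=true | Toothache=true), normalising by P(Toothache=true)
p_tooth = sum(p for (tooth, cavity), p in joint.items() if tooth)
p_cavity_given_tooth = joint[(True, True)] / p_tooth

print(p_cavity, p_cavity_given_tooth)  # ≈ 0.05 and 0.4
```

Any query can in principle be answered this way; the catch, covered in the reading, is that the full joint table grows exponentially in the number of variables.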

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Set: 17/10/12

AI:A Modern Approach

Section 13.1 - Acting Under Uncertainty

Section 13.2 - Basic Probability Notation

Section 13.3 - Inference Using Full Joint Distributions

Section 13.4 - Independence

Valid from after the lecture on October 19th. The first two sections recap what we covered in the first lecture on probabilistic AI. The second two sections build on this to give you some basic tools for doing inference.

Lecture 8 17/10/12
Probabilistic AI

In this lecture we look at some of the fundamental ideas and notation in probability theory that we will build on in future lectures.

Lecture 7 16/10/12
vimeo

Past exam questions

2010: Q2c

2012: Q4

In this lecture we look at adversarial search, particularly the minimax and alpha-beta pruning algorithms.
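A minimal sketch of minimax with alpha-beta pruning on a hand-built two-ply game tree (the tree and its leaf utilities are invented for illustration, in the style of the AI:AMA Chapter 5 example):

```python
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Return the minimax value of `node`: a leaf utility, or a list of children."""
    if not isinstance(node, list):      # leaf: a utility value
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:           # prune: MIN would never allow this
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:           # prune: MAX already has a better option
                break
        return value

# MAX chooses among three MIN nodes, each with three leaf utilities
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(alphabeta(tree, True))  # → 3
```

With or without pruning the value is the same; alpha-beta just avoids evaluating branches that cannot affect the decision (here, the second MIN node is abandoned after its first leaf).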

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Set: 10/10/12

AI:A Modern Approach

Section 5.1 - Games

Section 5.2 - Optimal Decisions in Games

Section 5.3 - Alpha-Beta Pruning

Valid from after the lecture on October 10th. This reading builds on the understanding you have developed of search by introducing adversarial search algorithms (where you are searching for solutions to defeat an opponent).
Lectures 5 and 6 9-10/10/12
Informed Search

Informed Search 1

Informed Search 2

Past exam questions

2005: Page 10

2011: Q7

2012: Q1b

In these lectures we'll look at informed search (greedy best-first search and A* search), and some of the properties of these searches and associated heuristics.
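A minimal A* sketch on a small hand-built graph; the graph, step costs and heuristic values are all invented for illustration (and the heuristic is chosen to be admissible):

```python
import heapq

graph = {                      # node -> list of (neighbour, step cost)
    "S": [("A", 1), ("B", 4)],
    "A": [("B", 2), ("G", 6)],
    "B": [("G", 2)],
    "G": [],
}
h = {"S": 4, "A": 3, "B": 2, "G": 0}   # admissible heuristic estimates to G

def astar(start, goal):
    """Expand nodes in order of f = g + h; return (cost, path) or None."""
    frontier = [(h[start], 0, start, [start])]   # (f, g, node, path)
    explored = set()
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        if node in explored:
            continue
        explored.add(node)
        for nbr, cost in graph[node]:
            if nbr not in explored:
                heapq.heappush(frontier,
                               (g + cost + h[nbr], g + cost, nbr, path + [nbr]))
    return None

print(astar("S", "G"))  # → (5, ['S', 'A', 'B', 'G'])
```

Greedy best-first search is the same loop ordered by h alone, which sacrifices the optimality guarantee that A* gets from an admissible heuristic.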

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Set: 05/10/12

AI:A Modern Approach

Section 3.5 - Informed (Heuristic) Search Strategies - stop before 3.5.3

Valid from after the lecture on October 3rd. This reading builds on the understanding you have developed of uninformed search by introducing informed (or heuristic) search algorithms that make use of knowledge about the search task in addition to the problem description.
Lectures 3 and 4 02-03/10/12
Uninformed Search

Uninformed Search 1

Uninformed Search 2

Past exam questions

2005: Page 8.

2006: Pages 9 & 10

2007: Q5

2008: Q2a & Q5

2009: Q7

2010: Q2a & Q2b

2011: Q6

2012: Q1a

In these lectures we'll look at the various forms of uninformed search.
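As an illustration, here is a minimal sketch of breadth-first search, one of the uninformed strategies from the reading, on an invented graph:

```python
from collections import deque

graph = {                  # node -> list of neighbours (hypothetical state space)
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": ["F"],
    "F": [],
}

def bfs(start, goal):
    """Expand the shallowest node first; returns a path with fewest edges."""
    frontier = deque([[start]])     # FIFO queue of paths
    explored = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nbr in graph[node]:
            if nbr not in explored:
                explored.add(nbr)
                frontier.append(path + [nbr])
    return None

print(bfs("A", "F"))  # → ['A', 'B', 'D', 'F']
```

Swapping the FIFO queue for a stack gives depth-first search; the reading compares these strategies on completeness, optimality, and time and space complexity.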

If you want a question you create for this tag to count towards your total for the term, remember to create it before the deadline above.

Set: 26/09/12

AI:A Modern Approach

Section 3.1 - Problem Solving Agents

Section 3.2 - Example Problems

Section 3.3 - Searching for Solutions

Section 3.4 - Uninformed Search Strategies

The reading for next week covers the foundational concepts of AI search and uninformed search strategies. This repeats some of the material from the first week's lectures so you have more reading to do than you would normally have (i.e. material for nearly two weeks rather than one).
Lecture 2 26/09/12
Problem Solving as Search
vimeo

In this lecture we take our first look at formulating problems as search problems, and cover the fundamental elements of search problems: states and actions.

Set: 13:00 24/09/12

In Intro to AI we will be using PeerWise as a place for you to create, share and evaluate assessment questions with your classmates. Start by visiting PeerWise here: http://peerwise.cs.auckland.ac.nz/at/?bham_uk

If you have not used PeerWise before, just click the "Registration" link and follow the prompts. All you need to do is choose a username and a password for your PeerWise account. The username can be anything you want (but it will be viewable by staff and other students).

Once logged in select "Join course" from the Home menu.

To access our course, "Introduction to AI", you will need to enter two pieces of information:

1. Course ID = 6553

Once you are in, please answer at least four questions tagged with "warm-up". These questions are on topics about the School of Computer Science, the University of Birmingham or the structure of this module (i.e. they are not on AI themselves). To see your unanswered questions for this topic, use this link (once registered and logged in).

During this term you are also required to create at least four questions on PeerWise. Each question you create must use a different tag created by one of the module admins, and it must be created before the deadline for answering questions for that tag has passed (e.g. if you wish to ask a question for the "warm-up" tag, you must do so before the deadline associated with this test, 26/09/2012).

Lecture 1 24/09/12
Introduction to Introduction to AI
vimeo