The mechanical engineer seeks to prove a design by making mathematical calculations about the proposal, showing perhaps the best-, worst- and average-case loads on the proposed system. The calculations are the expression of an abstract model of mechanics. This abstract model is written in the language of mathematics, and the language of mathematics is well understood. For instance, if we have a statement like:

3 + 9 x 4

This includes two operations, `plus' and `multiply'. We "know" what `+' and `x' mean and also the meaning of `3', `4', and `9'. We probably all know that the rules of arithmetic dictate that we should calculate the answer as if the expression were bracketed as:

3 + (9 x 4).

Let's transfer this example to computing. If we write a program in (for instance) C, how do we know that it works? The process of debugging itself won't help, for debugging has been defined as replacing one set of errors with another. More optimistically, it can reveal bugs but can't prove that there are no more present.
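As an aside, the precedence rule in the arithmetic example above is built into most programming languages. A quick check in Python (used here purely for illustration):

```python
# Multiplication binds tighter than addition, so the expression
# is parsed as 3 + (9 * 4), not (3 + 9) * 4.
print(3 + 9 * 4)    # prints 39
print(3 + (9 * 4))  # prints 39: the same parse, made explicit
print((3 + 9) * 4)  # prints 48: the other bracketing
```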

If we are programming in a language like C then we could try to write down the meaning of our program in an abstract language in the same way that a mechanical engineer would model their design in mathematics. The problem is that it is very difficult to find an abstract language that is suitable. Ultimately, if you want to write the meaning of a program written in something like C, Pascal, Fortran, COBOL or the like, you have to describe the operations in terms of the effects the execution of the program has on the workings of a conventional computer, the so-called Von Neumann machine.

To those who've grown up with computers and worked with them since their teenage years, the Von Neumann architecture seems very normal. However, try explaining a process like sorting a list of names in terms of a Von Neumann machine to an intelligent person, eg one from Mars or (more reasonably) a graduate without computing experience. Their reaction is likely to range from silent disbelief to the expression of amazement that your intelligent mind finds satisfaction in pushing little pieces of information around such an unintelligent machine.

You may think this an exaggeration, but let's go back to the mathematical example. Suppose I were to tell you that the meaning of

3 + 9 x 4

is the following:

Press ON/CLEAR
Press 3
Press +
Press (
Press 9
Press x
Press 4
Press )
Press =
Read 39 from display

What I have done is to claim that the meaning of 3 + (9 x 4) is the sequence of keys I have to press on my rather old calculator. Is this a meaning? If it is a possible meaning, is it an efficient expression of a meaning that isn't dependent on knowledge of a particular (perhaps arbitrary) machine?

Languages like C, Pascal, Fortran, Basic, etc are said to be *procedural*
languages. It is only possible to talk about their meaning with reference to
the procedure that they perform on the Von Neumann machine.

*Declarative* languages differ from procedural languages on the key point
that it is possible to talk about their meaning with reference to some abstract
language rather than with reference to a machine.

There are two families of declarative languages. *Functional* languages
use the language of mathematical functions to provide their meaning. Programs
are written in the form of functions, each of which returns a value. The
meaning of a function call is defined in terms of the value returned. So the
meaning of a function can be seen as a declaration of its returned value.
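To illustrate the point (the example and its name are chosen purely for this sketch), here is a pure function in Python whose meaning can be read off as the value it returns:

```python
def square(x):
    # A pure function: its meaning is fully captured by the value it returns.
    return x * x

# The meaning of the call square(3) is the value 9; the call could be
# replaced by 9 anywhere without changing the program's meaning.
print(square(3))  # prints 9
```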

*Logic* languages, as their name suggests, rely on logic to provide the
framework for their meaning. Logic itself was invented as an aid to human
thought, allowing knowledge to be set out and reasoned about. It would
therefore seem natural to attempt to use it in programming computers. In
practice, first order predicate calculus is used, or at least a subset of it.
It is not necessary to understand predicate calculus before learning a logic
programming language. Indeed, this course takes the view that it is better to
learn Prolog, a practical example of a logic programming language, before
learning about the underpinning theory.

Logic-based languages had a prehistory with American attempts such as
Micro-Planner (in which Terry Winograd built the pioneering blocks world
natural language processing system, SHRDLU) and Conniver. Both attempts failed
to replace LISP as an artificial intelligence programming language because they
were extremely inefficient. The fact that Prolog overcame the problems that
bedevilled Micro-Planner and Conniver is due to the work by David H D Warren
and others at the University of Edinburgh on the efficient implementation of
Prolog. Warren's name has passed into logic programming in the name given to
Prolog's commonly used implementation technique, the *Warren Abstract
Machine*.

It would be incorrect to consider logic programming (and Prolog in particular) as purely artificial intelligence tools. Prolog has moved out to other problems, for instance complicated scheduling situations (such as assigning ships to berths), warehousing problems, and database problems.

Declarative languages lend themselves to manipulation and transformation. For instance, a program can be written in one form (perhaps with one task in mind) and transformed into a different form to produce another program that will run more efficiently, either for the same task or for a closely related variant. Another area of study is the relationship between the specification of a system and its transformation into a working program. This is greatly helped by the existence of the abstract semantic systems common to declarative languages.

The other area of interest is the use of logic programming languages for parallel execution. Prolog itself is a particular manifestation of logic programming that has been adapted for use on a sequential machine (ie the Von Neumann machine). Logic programming in its purest forms has several features that lend themselves to the design of parallel languages. One instance is Parlog[+], designed by Steve Gregory, although there are others.

Before you can learn about the advanced topics, you have first to learn the essentials of logic programming.

This course starts with Prolog, because it is believed that it is easier to learn the theory if you have an idea of how it might relate to a practical instance of the theory.

The Prolog course starts with data types and structures. Alongside these, the
logic variable is introduced. Variables in logic programming work differently
from those in procedural and functional languages. This comes out most
clearly when matching takes place, for logic programming uses a far more
powerful matching technique (called *unification*) than other languages
do.
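To give a flavour of what unification does before the Prolog proper begins, here is a minimal sketch in Python. The term representation and the helper names (`walk`, `unify`) are inventions for this illustration, and a real Prolog system adds refinements (such as the occurs check) omitted here:

```python
# Terms: variables are strings with an uppercase first letter (Prolog-style);
# compound terms are tuples (functor, arg1, ...); anything else is a constant.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    # Follow any chain of variable bindings in the substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s=None):
    """Return a substitution making a and b equal, or None if none exists."""
    if s is None:
        s = {}
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}          # bind the variable to the other term
    if is_var(b):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):      # unify compound terms argument by argument
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None                     # distinct constants: no unifier

# date(D, may, 1997) unifies with date(9, may, Year), binding both variables:
print(unify(("date", "D", "may", 1997), ("date", 9, "may", "Year")))
# prints {'D': 9, 'Year': 1997}
```

Note that, unlike assignment in a procedural language, unification is symmetric: either side may contain variables, and both are bound in one matching step.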

Having mastered data types, structures and matching, the other main topic is
the building of programs. Unlike languages such as Pascal, which use iteration
(with loop constructs like `for`, `while` and
`repeat/until`), Prolog has no iteration as such but uses recursion.
This part of the course is concerned with showing and analysing a variety of
different examples so that you can learn to identify the pattern of recursion
that would best solve the problem in hand.
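As a foretaste, here is the shape of one such recursive pattern, sketched in Python rather than Prolog (the Prolog versions come later in the course); the clauses quoted in the comment are the standard textbook definition of list length:

```python
def length(xs):
    # Mirrors the recursive Prolog definition
    #   length([], 0).
    #   length([_|Tail], N) :- length(Tail, M), N is M + 1.
    if not xs:                     # base case: the empty list has length 0
        return 0
    return 1 + length(xs[1:])      # recursive case: one more than the tail

print(length(["a", "b", "c"]))  # prints 3
```

The pattern — a base case on the empty list plus a recursive case on the tail — is the one you will meet again and again when writing Prolog predicates over lists.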

[+] Parlog was developed by Steve Gregory at Imperial College, London. It is a parallel language that runs on a simulated parallel machine; the simulation, of course, runs on a sequential machine, such as a Sun workstation. The system is widely available in the UK and is one of the languages available for advanced work in the School. It is described in: Gregory, Steve. (1987).

© P.J.Hancox@bham.ac.uk