4 February 2002
Bridging Theorem Proving and Mathematical Knowledge Retrieval
Volker Sorge
University of Birmingham
Accessing a single knowledge source from different client applications often requires mediator systems as middleware components. In the domain of theorem proving, large efforts have been made to formalise mathematical and verification knowledge and to structure it in databases. But these databases are either specialised for a single client, or, if the knowledge is stored in a general database, the services the database can provide are usually limited and hard to adjust for a particular theorem prover. Only recently have there been initiatives to flexibly connect existing theorem proving systems into networked environments that contain large knowledge bases. An intermediate layer containing both search and proving functionality can be used to mediate between the two. In my talk I shall motivate the need for, and discuss the requirements of, mediators between mathematical knowledge bases and theorem proving systems. I shall also present a first attempt at a concurrent mediator between a knowledge base and a proof planning system.
11 February 2002
An overview of biologically-inspired complex adaptive systems research at HP
David Cliff
HP Labs, Bristol
Research in artificial life, autonomous agents, and virtual environments has been underway in various forms at HP Labs Bristol since at least 1994. After a brief introduction to why HP Labs exists and what we do, I'll give an overview of some of our projects between then and now. This will include work on evolving autonomous agents for collective behaviours; on a telecomms routing system inspired by ant food-foraging; on adaptive agents that bargain and trade in commodity markets (recently demonstrated to be better than human traders); on developing massive persistent online virtual worlds for digital ecosystems used in entertainment applications; and possibly also on an automated dance-music DJ that uses evolutionary computation techniques to compose its own tunes.
25 February 2002
Evolving Decision Principles
Aniko Ekart and S.Z. Nemeth
University of Birmingham
In multicriteria decision problems many subjective decisions must be made, such as assessing the importance of the different criteria and the values of the alternatives with respect to subjective criteria. Since subjective decisions are not very accurate, it is very important to analyse the sensitivity of the results when small modifications are made to the subjective values. When solving a multicriteria decision problem, it is desirable to choose a decision principle that leads to a solution that is as stable as possible. We propose here a method based on genetic programming that produces better decision principles than the commonly used ones. The theoretical expectations are validated by two case studies.
4 March 2002
Formal Theories of Geographical Information
Antony Galton
University of Exeter
The distinction between object-based and field-based conceptions of geographical reality has become well-known in recent years. It replicates at a conceptual level the distinction between vector-based and raster-based implementation methods for GIS. In this talk I shall endeavour to lay the groundwork for a careful, mathematically rigorous development of the relevant ideas at the conceptual level. The notions of object and field are given precise mathematical definitions, and an appropriate formal apparatus is constructed by which to address such issues as the interconvertibility of the object-based and field-based paradigms, and their relative adequacy for different representational problems. I shall also consider whether there is a need for additional representational structures to do justice to the wealth and complexity of our geographical understanding, and if so, what form these should take.
11 March 2002
AI Without Emotions
Blay Whitby
University of Sussex
Emotion is often held to be an essential (if somewhat poorly understood) ingredient of cognition. It follows from this claim that if we are to produce intelligent machinery then we need to include some sort of emotional element. This position is most clearly put by Rosalind Picard in 'Affective Computing', but has also been simply assumed by other authorities such as Marvin Minsky. This talk challenges the claim that cognition necessarily involves emotion: it is, in principle at least, possible to construct intelligent machinery with no emotional faculties at all. The talk represents work in progress and is intended to provoke further thought and discussion, rather than to present an unassailable argument.
18 March 2002
Aggregating States for Markov Processes
Jon Rowe
University of Birmingham
When a random process involves a lot of states, it can become impractical to deal with all of them. We would like to group states together into equivalence classes and just consider the transitions between classes. However, this has to be done in a consistent way, so that the probability of going from one class to another does not depend on the particular states involved. Given a transition matrix (or, in general, any linear operator) we seek to characterise the equivalence classes that have this property. This can be (partially) done by looking at a subgroup of the permutations of the set of states. An example will be given involving the mutation operator in genetic algorithms.
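As a minimal illustration of the consistency requirement described above (a sketch, not code from the talk), this is the classical "lumpability" condition: every state in a class must have the same total transition probability into each other class. The 4-state chain and partition below are made up for the example:

```python
import numpy as np

# Transition matrix of a hypothetical 4-state Markov chain (rows sum to 1).
P = np.array([
    [0.10, 0.40, 0.30, 0.20],
    [0.40, 0.10, 0.20, 0.30],
    [0.25, 0.25, 0.30, 0.20],
    [0.25, 0.25, 0.20, 0.30],
])

# Proposed partition of the states into equivalence classes.
classes = [[0, 1], [2, 3]]

def is_lumpable(P, classes):
    """A partition is consistent (lumpable) if, for every pair of
    classes (src, dst), all states in src have the same total
    probability of moving into dst."""
    for src in classes:
        for dst in classes:
            totals = [P[i, dst].sum() for i in src]
            if not np.allclose(totals, totals[0]):
                return False
    return True

print(is_lumpable(P, classes))  # this partition is consistent
```

When the condition holds, the class-to-class transition probabilities are well defined and the aggregated process is itself a Markov chain.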
25 March 2002
Pattern Recognition and Machine Intelligence: A Web Service Perspective
Simon Lucas
University of Essex
Pattern recognition and machine intelligence algorithms are difficult to design and even harder to evaluate. In this talk I'll explore the novel concept of offering such algorithms as web services. This simple idea leads to a radical rethink of how we design, deploy, evaluate and exploit such algorithms, and will form the basis of the new version of the Algoval evaluation service. While it is straightforward to deploy simple 'hello-world' examples as web services, we'll see that providing useful real-world services is a bit more challenging, but extremely useful and very much worth the effort. I'll illustrate the talk with examples in optical word recognition, evolving computer checkers players, and sequence recognition.
8 April 2002
Comparing Evolutionary and Local Search Algorithms
Darrell Whitley
Colorado State University, USA
In this talk, the concept of "No Free Lunch" will be very briefly discussed as it relates to empirical comparison of search algorithms on real world problems. Then various algorithms will be compared for 3 classes of problems: flowshop scheduling, inverse radiance transfer for modeling clouds and finally, a computer vision problem involving model matching. For each application a different, specialized form of evolutionary algorithm is compared to another specialized form of local search.
15 April 2002
Diversity in Genetic Programming
Steven Gustafson
University of Nottingham
Measures of diversity are often used in the field of genetic programming to describe and characterise the evolutionary search, on the assumption that more diversity is better. However, the most popular representation of individuals in genetic programming, the tree, poses several problems for measuring diversity (other representations have similar issues). It is not clear whether to compare genotypes (trees), phenotypes (fitness values), or to use more complex comparisons between individuals. Moreover, the cost of computing these measures may not be worth their benefit, and the fact that some measures indicate a major loss of diversity early in the evolution highlights the need to study the use and effects of diversity in genetic programming. This talk will describe our preliminary survey and analysis of diversity measures, along with current and future work.
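As a toy illustration of the genotype/phenotype distinction raised above (not the speaker's measures), one simple family of diversity measures is the Shannon entropy of the distribution of trees versus the distribution of fitness values. The tiny hypothetical population below is more diverse genotypically than phenotypically, since distinct trees can share a fitness:

```python
from collections import Counter
import math

# Toy population: each individual is a (tree-as-string, fitness) pair.
population = [
    ("(+ x 1)", 0.50),
    ("(+ x 1)", 0.50),
    ("(* x x)", 0.25),
    ("(- x 1)", 0.50),
]

def entropy(values):
    """Shannon entropy of the value distribution, in bits: one
    simple diversity measure; higher means more diverse."""
    counts = Counter(values)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

genotype_div = entropy(g for g, _ in population)   # over trees
phenotype_div = entropy(f for _, f in population)  # over fitness values
```

Here `genotype_div` exceeds `phenotype_div`, illustrating how the choice of comparison level changes what the measure reports.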
20 May 2002
The Role of Argumentation and Negotiation in Building Legal Knowledge Based Systems
John Zeleznikov
Joseph Bell Centre for Forensic Statistics and Legal Reasoning,
Law School, Faculty of Law, University of Edinburgh
The use of knowledge discovery techniques to construct intelligent legal decision support systems in discretionary domains will: 1) enhance consistent decision-making; 2) lead to increased confidence in the justice system by making legal decision-making transparent; and 3) provide support for alternative dispute resolution. The development of legal decision support systems on the World Wide Web will increase access to legal knowledge. In this talk we discuss how to place legal decision support systems on the World Wide Web. We discuss the role of argumentation and negotiation in building such systems, and describe two tools (WebShell and ArgumentDeveloper) we have developed for placing legal decision support systems on the World Wide Web.
13 June 2002
A Real-parameter Evolutionary Algorithm with a Parent-Centric Recombination Operator
Kalyanmoy Deb
IIT Kanpur, India
Due to an increasing interest in solving real-world optimization problems using evolutionary algorithms (EAs), researchers have developed a number of real-parameter genetic algorithms (GAs) in the recent past. In such studies, the main research effort is spent on developing an efficient recombination operator. Such recombination operators use probability distributions around the parent solutions to create an offspring. Some operators emphasize solutions at the center of mass of the parents, and some the regions around the parents. In this paper, we propose a generic parent-centric recombination operator (PCX) and a steady-state, elite-preserving, scalable, and computationally fast population-alteration model (which we call the G3 model). The performance of the G3 model with the PCX operator is investigated on three commonly used test problems and is compared with a number of evolutionary and classical optimization algorithms, including other real-parameter GAs with UNDX and SPX operators, the correlated self-adaptive evolution strategy, the differential evolution technique, and the quasi-Newton method. The proposed approach is found to perform consistently and reliably better than all the other methods used in the study. A scale-up study with problem sizes of up to 500 variables shows a polynomial computational complexity for the proposed approach. This extensive study clearly demonstrates the power of the proposed technique in tackling real-parameter optimization problems.
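As a rough sketch of the parent-centric idea only (Deb's actual PCX perturbs along orthogonal directions spanned by the other parents; the parameter values here are illustrative assumptions), an offspring can be centred on one randomly chosen parent and perturbed mainly along the direction from the parents' mean to that parent:

```python
import numpy as np

rng = np.random.default_rng(0)

def pcx_offspring(parents, sigma_zeta=0.1, sigma_eta=0.1):
    """Simplified parent-centric recombination: the offspring is
    centred on a randomly chosen parent, perturbed along the
    mean-to-parent direction, plus smaller noise restricted to
    the directions orthogonal to it."""
    parents = np.asarray(parents, dtype=float)
    mean = parents.mean(axis=0)
    p = parents[rng.integers(len(parents))]  # the chosen parent
    d = p - mean                             # parent-centric direction
    noise = rng.normal(0.0, sigma_eta, size=p.shape)
    if np.dot(d, d) > 0:
        # keep only the component of the noise orthogonal to d
        noise -= (np.dot(noise, d) / np.dot(d, d)) * d
    return p + rng.normal(0.0, sigma_zeta) * d + noise

parents = [[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]]
child = pcx_offspring(parents)
```

The key design point, in contrast to mean-centric operators such as UNDX, is that offspring stay near a parent rather than near the centre of mass, which preserves population spread.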
17 June 2002
AI For Computer Games: An Overview and Example
Nick Hawes
University of Birmingham
The talk will start with an overview of AI research for computer games. This will cover why such research is needed and what it can offer the academic community. This will lead on to some views on the (lack of) progress such research has made, and the possible causes of this. After this rather abstract introduction, a specific example of AI research for games will be presented. This example is an agent for Unreal Tournament that uses an anytime hierarchical task network planner as its principal behaviour generation method.
27 July 2002
Evolving SQL Queries for Data Mining
Majid Salim
University of Birmingham
A methodology is presented which uses the principles of Evolutionary Computation (EC) to evolve SQL statements that classify datasets. The approach is simple and novel, but appears to be successful in several instances. This talk will describe the methodology that has been designed, demonstrate some of the results that have been obtained, and end with a discussion of shortcomings and planned future work.
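As a toy sketch of the general idea only (not the speaker's system; the table, feature, and loop are all made up for illustration), one can treat a numeric threshold inside a SQL WHERE clause as the genome and use classification accuracy, computed by the database itself, as the fitness; a (1+1)-style evolutionary loop then evolves the query:

```python
import random
import sqlite3

# Toy table: items with one numeric feature x and a binary label.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE data (x REAL, label INTEGER)")
rows = [(i / 10.0, int(i / 10.0 > 0.6)) for i in range(11)]
con.executemany("INSERT INTO data VALUES (?, ?)", rows)

def fitness(threshold):
    """Accuracy of the candidate query condition 'x > threshold'
    as a predictor of label = 1 (SQLite comparisons yield 0/1)."""
    q = f"SELECT COUNT(*) FROM data WHERE (x > {threshold}) = label"
    return con.execute(q).fetchone()[0] / len(rows)

# (1+1)-style evolution: mutate the threshold, keep improvements.
random.seed(1)
t = random.uniform(0.0, 1.0)
for _ in range(200):
    child = t + random.gauss(0.0, 0.1)
    if fitness(child) >= fitness(t):
        t = child
```

A real system would evolve richer query structure (columns, operators, conjunctions) rather than a single threshold, but the fitness-via-query pattern is the same.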
16 September 2002
A Review of Using C++ for AI Programming
Steve Leach
Watchfield.Com Ltd.
Over the past year I have been working on a next generation text recognition engine for a major software vendor. Written entirely in C++, it includes a slew of techniques familiar to AI programmers. This work has provided an opportunity to reflect upon the merits and demerits of modern C++ in the context of an AI application. C++ itself has advanced somewhat in recent years and some of these improvements should, at least on paper, be of benefit to non-numerical programmers. I will begin this talk by briefly reviewing the nature of AI programming and the salient features of C++. The main section will highlight the matches and mismatches, draw out the implications for effective programming, and illustrate these with examples drawn from my work. I will then compare the use of C++ against some of the languages more commonly used for AI. Lastly, these comparisons naturally lead to some proposals for future programming languages.
23 September 2002
Factoring the Structure of a Thesaurus into a Semantic Classification Decision
Viktor Pekar
Bashkir State University
Ufa, Russia
The talk is concerned with the task of automatically augmenting an existing lexical resource (a thesaurus) with new lexical items found in a corpus. In this setting, a new word is assigned to a class on the basis of its distributional similarity to that class. The talk reports the results of a study which examined different ways to take advantage of the taxonomic relations between classes during the classification procedure. It compares three standard classification algorithms that have previously been applied to this task (the k-nearest-neighbour, centroid-based and class-based methods) with two methods that incorporate the structure of the thesaurus into the classification procedure (so-called top-down and bottom-up search).
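As a small illustration of one of the standard baselines mentioned above, the centroid-based method assigns a new word to the class whose averaged member vector is most similar; the class names and distributional vectors below are made up for the example:

```python
import numpy as np

# Toy distributional vectors (e.g. co-occurrence counts) for the
# known members of two thesaurus classes, and for a new word.
classes = {
    "animal": np.array([[3.0, 0.0, 1.0], [2.0, 1.0, 0.0]]),
    "vehicle": np.array([[0.0, 3.0, 2.0], [1.0, 2.0, 3.0]]),
}
new_word = np.array([2.0, 0.0, 1.0])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def centroid_classify(word_vec, classes):
    """Centroid-based method: pick the class whose centroid
    (mean of its members' vectors) is most similar to the word."""
    return max(classes,
               key=lambda c: cosine(word_vec, classes[c].mean(axis=0)))

print(centroid_classify(new_word, classes))
```

The structure-aware methods in the talk differ in which candidate classes are compared (walking the taxonomy top-down or bottom-up) rather than in the similarity computation itself.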
30 September 2002
Generalised Dynamic Search with Charged Swarms
Tim Blackwell
UCL, London
7 October 2002
Polarisation Imaging for Skin Characterisation
S.P. Morgan
School of Electrical and Electronic Engineering
University of Nottingham
Much effort has been directed towards the development of non-invasive optical techniques to interrogate the human body for medical purposes. These techniques provide fast diagnosis with minimal discomfort to the patient and disruption to the area of interest. The main drawback is that light is heavily scattered within tissue and this leads to uncertainty in the volume from which information is retrieved. There is an increasing interest in the use of optical techniques for characterising layered media for applications such as burn depth measurement and melanoma diagnosis. Polarised light imaging utilises the property that light depolarises as it propagates and the initial polarisation of incident light is lost within relatively few scattering events. This can be used to localise light in the superficial tissue close to the surface. Using Monte Carlo simulations and neural network inversion we demonstrate the use of polarised light measurements to extract the optical properties and thickness of layered scattering media. A drawback of polarisation imaging is that light reflected from the surface of the tissue dominates the image. An optically flat plate and matching fluid applied to the tissue surface, combined with off-axis detection, has previously been used to eliminate this problem. However this is often inappropriate or inconvenient for practical use. A technique has been developed which combines images obtained with linearly and circularly polarised light to produce an image of light localised within the superficial layers that is free from surface reflections and does not require optically flat plates or matching fluid.
14 October 2002
Lucy and the Quest for the Holy Grail
Steve Grand
Cyberlife Research
The cerebral cortex seems to be a general-purpose machine with the capacity to self-organise into an assortment of useful specialist machines, guided only (or at least mostly) by the information being supplied to it. This is such a striking facility that one might expect models of cortical function to lie at the heart of most AI research, yet in practice it seems to receive very little attention. There are several very good reasons for this, but it doesn't follow that the problem is intractable. Lucy the robot is my (amateur and unfunded) attempt to discover the basic engineering principles that lie behind cortical organisation. The project is still in its early days, but is nevertheless starting to show promise. This is a progress report.
21 October 2002
Modelling Complex Systems
George Milne
University of Western Australia
Approaches to modelling complex systems will be discussed.
23 October 2002 - WEDNESDAY
Evolution of Hierarchical Modularity in Intracellular Interaction Networks
Jennifer Hallinan
University of Queensland
Networks of interactions evolve in many different domains. They tend to have topological characteristics in common, probably due to common evolutionary factors. It has been recently suggested that one such common characteristic is the presence of a hierarchically modular organization. We describe a new algorithm for the detection and quantification of hierarchical modularity, and demonstrate that the yeast protein-protein interaction network does have a hierarchically modular organization. We further show that such organization is evident in artificial networks produced by computational evolution using a gene duplication operator, but not in those evolving via preferential attachment of new nodes to highly connected existing nodes.
28 October 2002
Evolving French Flags and Other Unlikely Organisms
Julian Miller
School of Computer Science
University of Birmingham
Living organisms are some of the most complex designs in the universe. Yet, unlike human designs they are constructed from the bottom-up at a molecular level. Most living organisms begin as a single cell. This replicates and differentiates and eventually forms a whole organism. There is no centralised organisation behind this. The talk describes my work on using a form of Genetic Programming to evolve the program for a single cell that allows it to develop in a way that parallels biological development. Interesting growing multi-cellular "organisms" that grow to arbitrary size can be produced. These artificial organisms appear to be able to recover when damaged. Sometimes the evolved cell programs develop structures that have an elegant mathematical precision.
4 November 2002
Modelling Human Compositional Processes using a Genetic Algorithm
Andrew Gartland-Jones
Centre for Computational Neuroscience and Robotics
University of Sussex
There is now a substantial body of work utilizing Genetic Algorithms (GAs) for the purpose of musical composition. A common point of discussion is how far GAs can simulate not just the musical output of human composers, but the process of composing itself. I will discuss the construction of a Generative Music System that utilizes a domain-specific, knowledge-rich GA. The system attempts to simulate at least some of the processes of composition in order to generate new musical material from two supplied fragments.
11 November 2002
Unifying Evolutionary Dynamics
Karen Page
University College London
Darwinian evolution is based on three fundamental principles: reproduction, mutation and selection, which describe how populations change over time and how new forms evolve out of old ones. There are numerous mathematical descriptions of the resulting evolutionary dynamics. In this paper, we show that apparently very different formulations are part of a single unified framework. At the centre of this framework is the equivalence between the replicator-mutator equation and the Price equation. From these equations, we obtain as special cases adaptive dynamics, evolutionary game dynamics, the Lotka-Volterra equation of ecology and the quasispecies equation of molecular evolution.
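For reference, the two equations at the centre of this framework can be written in their standard forms (as commonly found in the literature, not quoted from the talk):

```latex
% Replicator-mutator equation: x_i is the frequency of type i,
% f_j the fitness of type j, q_{ji} the probability that
% reproduction of type j yields type i, and \phi the average fitness.
\dot{x}_i = \sum_{j} x_j f_j q_{ji} - \phi\, x_i,
\qquad \phi = \sum_{j} x_j f_j .

% Price equation (continuous-time form): the change in the mean
% trait value \bar{z} splits into a selection term (covariance of
% fitness and trait) and a transmission term, where \Delta z_i is
% the expected change in trait value under mutation from type i.
\dot{\bar{z}} = \operatorname{Cov}(f, z) + \operatorname{E}(f\, \Delta z).
```

The equivalence mentioned in the abstract comes from reading the same population dynamics through either equation: trait averages evolved under the replicator-mutator equation satisfy the Price equation, and vice versa.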
18 November 2002
Self-Assembly of DNA-like structures In-Silico
Max Garzon, Derrel Blain, Kiran Bobba, Andrew Neel, Michael West
Computer Science Division, The University of Memphis
Memphis, TN 38152 U.S.A
Biomolecular computing (BMC) aims to capture, for computational purposes, the innumerable advantages that biological molecules have gained in the course of millions of years. While biomolecules have resolved fundamental problems as a parallel computing system that we are just beginning to decipher, BMC still suffers from our inability to harness these properties to bring biomolecular computations to the levels of reliability, efficiency and scalability that are now taken for granted with solid-state computers. In the same way that evolutionary algorithms capture, in silico, the key properties of natural evolution, we explore an alternative approach to exploiting these properties by building virtual test tubes in electronics that capture the best of both worlds. We describe a distributed implementation of a virtual tube, Edna, on a cluster of PCs that aims to capture the massive asynchronous parallelism of BMC. We report results from three large experiments that benchmark Edna's performance, reproduce and extend Adleman's solution to the Hamiltonian Path problem for larger families of graphs than has been possible on a single processor or has actually been carried out in wet labs, and benchmark the feasibility and performance of DNA-based associative memories. The results provide strong evidence that the paradigm of molecular computing can be implemented much more efficiently (in terms of time, cost, and probability of success) in silico than in the corresponding wet experiments, at least in the range where Edna can be practically run. Consequently, we pinpoint the appropriate range of problem sizes and properties where wet biomolecular solutions would offer superior solutions. This approach also shows an intrinsic advantage in using DNA-like genomes for genetic algorithms and evolutionary computation.
25 November 2002
Evolution of Sensors
Daniel Polani
University of Hertfordshire
Nature shows a wide variety of cases where sensors evolved to capture information from the environment. The wide spectrum of variants, as well as the predominance of specific sensory modalities, indicates that certain fundamental principles underlie the evolution of sensors. The talk will give an introduction to the topic, give an overview of particularly interesting phenomena, and discuss how the process of capturing relevant sensory information can be understood theoretically.
18 December 2002 - WEDNESDAY
Cross-Document Coreference: Methodologies, Evaluations and Observations
Amit Bagga
Avaya Labs Research
Cross-document coreference occurs when the same person, place, event, or concept is discussed in more than one source. Computer recognition of this phenomenon is important because it helps break "the document boundary" by allowing a user to examine information about a particular entity/event from multiple text sources at the same time. In this talk, I will describe a system that resolves both entity and event coreference across documents. Issues regarding the evaluation of such systems will also be discussed. In addition, I will also describe extensions made to the original system including a 2-pass algorithm that works over degraded data sources, and a methodology for cross-media coreference. Finally, I will discuss future directions of research including cross-language coreference.