School of Computer Science, The University of Birmingham, CoSy project

Computation and Embodiment: Three issues, not two
Aaron Sloman
Installed: 27 Nov 2008
Last updated: 6 Jun 2009; 22 Dec 2009

Background: Origins of this note

This started as a brief comment on one of the points made in this paper:
Grand Challenge 7 -- Journeys in Non-Classical Computation
By Susan Stepney, Samson Abramsky, Andy Adamatzky, Colin Johnson & Jon Timmis, pp. 407--421
Available here.

In Proceedings of Visions of Computer Science: BCS Research Conference
September 2008.
Eds Samson Abramsky, Erol Gelenbe and Vladimiro Sassone
Proceedings available online here (PDF)

The paper claims (page 412), in Section 3.5 ('Embodiment and in materio computing'), that there are two important issues related to embodiment, namely (a) real-time close coupling (which is often how 'embodiment' is interpreted) and (b) computation directly by physical and chemical processes:

We choose to distinguish embodiment (real-time close coupling of a computational device with its complex environment) from in materio computing (computation directly by physical and chemical processes of a complex substrate, with little or no abstraction to a virtual machine). Most authors choose to conflate these properties (for example [MacLennan 2008]), requiring embodiment to be physical. However, separation of these concepts allows us to consider virtual embodied systems [Stepney 2007], for example, the embodiment of software agents closely coupled with a complex but virtual internet environment, and to consider non-embodied in materio computation (computation by a physical medium [Miller & Downing 2002] [Harding et al 2008]), where the computation performed by that medium need not be real-time, nor closely coupled with a complex environment.

In contrast, I think there are three things to be distinguished, of which the third is most important for understanding evolutionary pressures driving the development of biological cognitive systems:
  1. Environmental interfacing:
    Real-time close coupling of a computational device with its complex environment. E.g. use of compliant effectors to reduce the problems of precise control of manipulations (e.g. compliant wrist).
    (Real time, "online" control is the focus of a great deal of research on 'embodiment'.)

  2. In materio computing:
    Computation directly by physical and chemical processes of a complex substrate, e.g. neurons, molecules, analog electronic circuits, fluids, flexible but unstretchable strings (e.g. for shortest route computations).

    Stigmergy, e.g. the use of pheromone trails by insects, and similar processes where changes in some part of the environment resulting from actions of agents provide re-usable information for those agents, could be regarded as another form of 'In materio' computing, done outside the agent or agents.

  3. Ontologically matched computing:
    Computation using an information-processing architecture, including forms of representation, algorithms, information structures and information contents (including ontologies), geared to features of the environment rather than being merely general-purpose mechanisms, and required for purposes other than real-time physical interaction, e.g. planning, explaining, predicting, remembering, reporting.

    Examples: perceptual systems or effector systems that are hierarchically organised because things perceived in the environment and actions produced in the environment involve different levels of abstraction (see the diagrams below):

    When you are reading or producing written text, the levels can include coloured portions of surfaces, letters, words, syntactic forms, propositions expressed, stories, etc.

    Someone looking at a clockwork mechanism may see and think about static and moving surface fragments, larger functionally distinct components (gear wheels, springs, axles, flywheel, pendulum, escapement mechanism), geometrical, causal and functional relationships, materials with different properties (rigid, elastic, etc.)

    Processes of different sorts can occur in which geometrical, physical, causal and functional relationships change, e.g. one gear wheel turning another, or a flywheel or pendulum, together with an escapement mechanism, controlling the speed at which a spring or hanging weight drives the whole machine.

    Someone watching or thinking about humans or animals interacting sees or thinks about moving surfaces, intentional actions, weapons being used, social interactions (e.g. seeing two people fighting, or courting, or collaborating in getting a settee through a doorway).

    This is a bit like 'impedance matching' in electrical circuits.
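One of the in materio examples above, shortest-route computation with flexible but unstretchable strings, has a direct symbolic counterpart: pulling the start and goal nodes of a string network apart makes the taut strings trace out the shortest path. A minimal sketch of the same computation done algorithmically (Dijkstra's algorithm on an invented toy network):

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path by explicit symbolic computation.

    The in materio analogue: tie unstretchable strings with lengths
    equal to the edge weights, then pull the start and goal nodes
    apart; the strings that go taut form the shortest path.
    """
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (dist + weight, neighbour, path + [neighbour]))
    return float('inf'), []

# A small made-up network: node -> [(neighbour, string length), ...]
graph = {
    'A': [('B', 1.0), ('C', 4.0)],
    'B': [('A', 1.0), ('C', 2.0), ('D', 5.0)],
    'C': [('A', 4.0), ('B', 2.0), ('D', 1.0)],
    'D': [('B', 5.0), ('C', 1.0)],
}
dist, path = dijkstra(graph, 'A', 'D')
print(dist, path)  # 4.0 ['A', 'B', 'C', 'D']
```

The point of the contrast: the physical medium does in one parallel mechanical step what the symbolic version does by explicit search over a data structure.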


Note added 7 Feb 2009: Levels of dynamical system

Simple Dynamical system
Fig 1: This illustrates environmental interfacing, where most parts of a dynamical system (but not necessarily all parts) are closely and directly coupled with the environment via sensors and effectors, as in many control systems.

Layered Dynamical system
Fig 2: This illustrates a layered dynamical system composed of many sub-systems. Some sub-systems are, as before, closely coupled with the environment through sensors and effectors involved in online monitoring and control, while other sub-systems (further to the right in the diagram) are not directly interfaced with the environment via sensors and effectors, but are nevertheless used to represent information about the environment, including past, future, and spatially remote parts of it. The latter may represent the environment at different levels of spatial and temporal granularity, e.g. representing topological and functional relationships and changes, rather than metrical ones alone.
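The contrast between the two figures can be caricatured in code. The following is a purely illustrative sketch (all names, numbers and the toy 'world model' are invented): a reactive sub-system tightly coupled to a current sensor value, alongside a deliberative sub-system that plans over an abstract, sensor-independent, topological description of the environment.

```python
def reactive_step(sensed_distance, target_distance=1.0, gain=0.5):
    """Fig 1 style: online control, directly coupled to a sensor value.
    Returns a motor command proportional to the sensed error."""
    return gain * (target_distance - sensed_distance)

def deliberative_plan(world_model, goal):
    """Fig 2 style: operates on an amodal description of the
    environment (an ontology of places and connections), with no
    reference to any particular sensor or effector.
    Breadth-first search over topological, not metrical, structure."""
    frontier = [[world_model['start']]]
    while frontier:
        path = frontier.pop(0)
        if path[-1] == goal:
            return path
        for nxt in world_model['connections'].get(path[-1], []):
            if nxt not in path:
                frontier.append(path + [nxt])
    return None

world = {
    'start': 'hall',
    'connections': {'hall': ['kitchen', 'study'], 'study': ['garden']},
}
print(reactive_step(1.4))                  # a small corrective motor command
print(deliberative_plan(world, 'garden'))  # ['hall', 'study', 'garden']
```

The deliberative layer can refer to the garden while no garden is in view, which is exactly the point about representing spatially remote parts of the environment.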

The latter requires use of structures and processes that are "ontologically matched" to the contents of the environment, in a way that need not depend on what sensors and effectors are used, since entities and events that are remote in time and space cannot be directly sensed or acted on. These could be described as deep features of the environment, including properties of various kinds of material. The biological variants of this would have evolved relatively late. The red arrows indicate that some of the more abstract subsystems have semantic contents referring to things in the environment that cannot be directly sensed or acted on. This could include some of the high level percepts, e.g. seeing something as fragile, or as a person, or as a person intending to perform a certain action.

Different forms of representation, used for different purposes, are needed in different layers of such a system.

A lot of AI research has addressed both ends of the spectrum of dynamical subsystems, but has not integrated them fruitfully.

Some steps towards an appropriate architecture are described here.

The Juggling octopus

Jackie Chappell has drawn my attention to this news item about an octopus that juggles crabs and squirts water at a light in order to extinguish it.

  Otto the octopus wreaks havoc
  An octopus has caused havoc in his aquarium by performing juggling
  tricks using his fellow occupants, smashing rocks against the glass
  and turning off the power by short-circuiting a lamp.
  Last Updated: 12:22PM GMT 03 Nov 2008
The language used by the keepers to describe the mental states and processes of the octopus is very interesting.

Those who argue that contents of cognition depend crucially on body morphology may wish to consider whether an octopus and a human could acquire and use the same information about the environment, or whether the information used when an octopus looks at a target before squirting at it has nothing in common with the information used when a human looks at a target before throwing stones at it.

Added 7 Feb 2009
In both cases there is a 3-D space, an object some distance away from the perceiver-actor, and something that must be given an initial velocity and direction of movement such that its subsequent behaviour, in (rough) accordance with physical laws, will bring it into contact with the distant object.
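The common content can be made concrete. Under deliberately idealised assumptions (point projectile, level ground, no drag; the numbers are invented), the launch angle needed to hit a target at horizontal distance d with launch speed v is the same computation whether the projectile is a thrown stone or a squirted jet of water:

```python
import math

def launch_angle(distance, speed, g=9.81):
    """Angle (radians) at which to launch a projectile so that,
    ignoring drag, it lands at the given horizontal distance.
    Inverts the range formula R = v**2 * sin(2*theta) / g."""
    s = g * distance / speed ** 2
    if s > 1.0:
        raise ValueError("target out of range at this speed")
    return 0.5 * math.asin(s)

# The same abstract problem for stone-thrower and water-squirter:
theta = launch_angle(distance=2.0, speed=6.0)
print(math.degrees(theta))  # roughly 16.5 degrees
```

Nothing in the computation mentions arms, tentacles, eyes or nozzles: the morphology-specific work lies in translating its result into motor signals.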

It is true that the sensory apparatus providing the information from which distance and direction must be inferred can be very different in different organisms, and it is also true that the motor signals to transducers in the effectors will have to be very different if the effectors are as different as a squirting device and a throwing device.

But it does not follow that there's nothing in common in the forms of representation used at the highest levels of the control system at which targets are selected and actions initiated, and nothing common in the ontologies used, and nothing common in the information acquired and used. Moreover, if there is a particular form of representation that is specially useful for computing initial ballistic trajectories from spatial relationships, then it may have been developed in different organisms with different morphologies by 'convergent evolution'.

Equally, it is possible that the 'compiled' versions of the computations that can very rapidly compute required trajectories and required motor signals may be highly tailored to the body morphology: that would, for example, be the form of representation resulting from the extensive practice that produces rapid and fluent action. In that sense, the compiled, highly trained skills of a human will have to change as the human grows and alters shape, size, weight of body parts, relative lengths of body parts, etc. That morphology-dependent form of representation can change while other forms of representation, used for other purposes, e.g. planning, predicting, explaining, causal reasoning, theorising, and communicating, retain their more abstract form.

This difference between transient information required for online control (servoing) and enduring information required for other purposes, such as predicting, explaining, describing, communicating, retrospective analysis, etc., is what people fail to understand when they claim that the difference between the ventral and dorsal streams in humans is concerned with the difference between processing "what" information as opposed to "where" information. That is a daft theory, since "what" involves "where" for any structured complex object.

Compare A. Sloman, 1982, 'Image interpretation: The way ahead?', in Physical and Biological Processing of Images, Eds. O.J. Braddick and A.C. Sleigh, pp. 380--401. (See sections B2 and B6.)

A thinker (e.g. a mathematician) could be disembodied

See the discussion of a disembodied mathematician:
As I've pointed out more than once in the past, there is no logical reason why it should be impossible for a disembodied mathematician to be passionately concerned with investigations in pure number theory.

Its time could be fully occupied exploring conjectured theorems, trying to produce counter examples to conjectured non-theorems, searching for more general, more elegant, and deeper theoretical frameworks, etc. Highly motivated mathematicians will be able to give many more examples.

Such a disembodied mathematician could be delighted with successful proofs, disappointed or even depressed on discovering fallacies in its arguments, hopeful about lines of investigation that look promising, surprised at unexpected relationships, frustrated at repeated failures, and so on and so on.

None of this logically requires eyes, ears, legs, relationships with other people, etc, etc.

I remember, long ago when I was a mathematics student, the days I spent lying flat on my back with my eyes closed trying to find a proof of a theorem I had read about, namely that if A and B are two sets, and there is a one-to-one correspondence between A and a subset of B, and a one-to-one correspondence between B and a subset of A, then there is a one-to-one correspondence between B and A. (The Cantor-Schröder-Bernstein theorem? It looks trivial but isn't.)
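For reference, the theorem can be stated in a standard modern form, with one common proof idea sketched in the comment (assuming the amsthm theorem environment):

```latex
\begin{theorem}[Cantor--Schr\"oder--Bernstein]
If there exist injections $f\colon A \to B$ and $g\colon B \to A$,
then there exists a bijection $h\colon A \to B$.
\end{theorem}
% One standard proof sketch: partition $A$ by tracing each element's
% chain of preimages under $f$ and $g$. On chains that start in $A$
% (or are doubly infinite) define $h = f$; on chains that start in
% $B$ define $h = g^{-1}$. Each case is a bijection onto its image,
% and the images partition $B$.
```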

The fact that I had to occasionally get up and eat, etc. was just a distraction. I fear that readers who have never had such experiences will not have any idea what I am talking about. Sounds as if Anders is one of them.

Of course, in my case there was a history of being embodied that led to my exposure to the problem and the language and concepts and techniques required for its formulation and solution. But that's just a contingent fact about me.

There's no logical reason why a disembodied fanatical mathematical researcher should not be created all ready to go: as long as it has the right architecture to support the process. (Whether any architecture implemented in a digital computer could suffice, or some Penrose-type quantum gravity machine is required is a separate question which I'll ignore for now.)

Maintained by Aaron Sloman