Overview of project


Metaphor is a prominent research focus in many fields: Linguistics, Literature, Philosophy, Psychology, Psychiatry, Education, Business, Politics, etc. Metaphor is important in all sorts of mundane discourse: ordinary conversation, news articles, popular novels, advertisements, etc. Issues of prime human interest -- such as relationships, money, disease, states of mind, passage of time -- are often most economically and understandably conveyed through metaphor. This ubiquity of metaphor presents an important challenge to how Artificial Intelligence (AI) systems both understand inter-human discourse (e.g. newspaper articles) and produce more natural-seeming expressions of metaphor. However, most research into cognitive mechanisms for metaphor, within and outside AI, has concerned understanding rather than generation. Despite increasing interest in generation, we are even further from an adequate account of it than we are of metaphor understanding. To redress the balance towards generation of metaphor, we directly tackle the role of AI systems in communication modelling, uniquely combining this with corpus-based results to guide output towards more natural forms of expression.

Our project aims to improve metaphor processing so as to make it more capable of playing a role in natural language processing (NLP). This will greatly increase the relevance and usefulness of NLP in a whole range of everyday activities, and thereby help improve the inclusion of people in the "digital economy" of Europe. Envisaged flow-on effects include improving language teaching technology, healthcare technology, and general service forms of electronic communications (libraries, schools, transportation).



Create, in prototype and partial form, a natural language understanding and generation system, orientated towards dialogue, using state-of-the-art generation mechanisms, and focusing mainly on metaphor.


Include, in that system, some initial mechanisms for choosing: whether to use metaphor at all; particular metaphorical conceptions; and particular metaphorical words or phrases.


Include some initial provisions in that system for creating natural forms of metaphorical expression as revealed by study of language corpora.


The architecture of the system we are proposing is essentially a pipeline of modules, each of which is a complete system in its own right, specialised for a particular task. We return to how this pipeline is constructed later. Immediately below is a brief sketch of each of the modules: ATT-Meta, Embodied Construction Grammar, and Dynamic Syntax.

ATT-Meta. Barnden's existing approach and implemented Artificial Intelligence system for carrying out the reasoning required for metaphor understanding (e.g. Barnden 2008). ATT-Meta is highly appropriate for the project as it is geared towards a central phenomenon: the inclusion of elements that open-endedly go beyond conventional metaphorical wording and, more broadly, beyond the immediate capabilities of known metaphorical mappings between subject matters. ATT-Meta also has a unique feature based on a novel insight into metaphor: the approach holds that reverse use of mappings, as well as the normal forwards use, is desirable for some aspects of understanding. In other words, ATT-Meta can transfer information from target to source, as well as in the more usual source-to-target direction. This reversed transfer is held to be crucial for the understanding of some metaphor, but it can also be adapted for generation. ATT-Meta's existing, implemented reverse use of mappings will provide our starting point for metaphor generation.
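The idea of running a metaphorical mapping in both directions can be sketched in a few lines of Python. This is an illustrative toy only: the mapping table, the `transfer` function, and the ANGER-IS-HEAT pairings are all invented for exposition, whereas the real ATT-Meta system performs rule-based reasoning rather than table lookup.

```python
# Toy sketch (not ATT-Meta code): a metaphorical mapping as aligned
# source-domain / target-domain term pairs, usable in both directions.

# Invented pairings in the spirit of the conventional ANGER-IS-HEAT metaphor.
ANGER_IS_HEAT = [
    ("hot liquid", "anger"),
    ("container", "person"),
    ("boiling over", "losing control"),
]

def transfer(mapping, term, reverse=False):
    """Map a term across domains; reverse=True performs target-to-source
    transfer, the direction ATT-Meta additionally makes available."""
    for source, target in mapping:
        if not reverse and source == term:
            return target          # usual source-to-target transfer
        if reverse and target == term:
            return source          # reverse transfer (generation starting point)
    return None                    # no mapping known for this term
```

Forward use supports understanding ("boiling over" maps to "losing control"); reverse use supplies source-domain material ("anger" maps back to "hot liquid") from which metaphorical wording could be generated.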

Embodied Construction Grammar (ECG). ECG is a language understanding (but not generation) system having aspects highly congenial to metaphor, and of interdisciplinary significance (Feldman 2010). We will use ECG because it employs a comprehensive model of the linguistic/conceptual interface, and has a current implementation (Bryant 2008). ECG models the conceptual level as interconnected schemas, and the linguistic level, together with its connections to the conceptual level, as interconnected constructions. Schemas are complex conceptual structures with parts called "roles" and constraints on them. E.g., there is a schema for the concept of somebody realising a transferer role transferring something to somebody else realising a transferee role. ECG's schemas are more extensively and deeply developed than the meaning representations used in existing NLG systems. They are also strongly geared towards the conceptual representations studied in Cognitive Linguistics; the ATT-Meta system's representations broadly have the same orientation. The notion of construction is familiar from work within Cognitive Linguistics more generally (e.g. Croft & Cruse 2004). E.g., an English ditransitive verb may employ a particular type of ditransitive construction having three constituents: subject, direct object and indirect object. For constructions, ECG formally specifies the ordering constraints operating over these constituents, as well as the linkage between the form and the schema (or a node in an external ontology) considered to provide the meaning of the construction. Constructions are far more flexible than representations in traditional grammatical frameworks, and are better able to cope with the diversity and flexibility of expression types, from single words to lengthy, multi-word expressions (MWEs).
Constructions are also useful for representing conventional metaphorical wording and its meaning, and indeed some of ECG's extensive array of constructions hold such wording, with MWEs rather than individual words typically having specific, standard metaphorical meanings.
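The schema/construction pairing described above can be sketched with minimal Python stand-ins. This is not ECG's actual notation (ECG has its own formal language; see Bryant 2008): the class names, the role names, and the `analyse` linkage are invented here purely to illustrate how a ditransitive construction could bind its constituents to the roles of a Transfer schema.

```python
from dataclasses import dataclass

# Toy stand-ins for ECG's schemas and constructions (names invented).

@dataclass
class Schema:
    name: str
    roles: dict      # role name -> filler (None until bound)

@dataclass
class Construction:
    name: str
    constituents: list   # ordering constraint: left-to-right constituent order
    meaning: Schema      # the schema this construction evokes

# A Transfer schema: somebody (transferer) transfers something (theme)
# to somebody else (transferee).
transfer_schema = Schema("Transfer",
                         {"transferer": None, "theme": None, "transferee": None})

# An English ditransitive construction, e.g. "Jack gave Jill a book":
# the form side fixes constituent order; the meaning side is the schema.
ditransitive = Construction(
    "Ditransitive",
    constituents=["subject", "verb", "indirect-object", "direct-object"],
    meaning=transfer_schema)

def analyse(construction, words):
    """Toy form-meaning linkage: bind constituent fillers to schema roles."""
    bindings = dict(zip(construction.constituents, words))
    schema = construction.meaning
    schema.roles["transferer"] = bindings["subject"]
    schema.roles["transferee"] = bindings["indirect-object"]
    schema.roles["theme"] = bindings["direct-object"]
    return schema
```

Analysing the word sequence Jack / gave / Jill / a book under this construction fills the transferer, transferee and theme roles respectively, which is the kind of form-to-schema linkage ECG specifies formally.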

Dynamic Syntax (DS). DS is an implemented computational approach to language generation and understanding that is specially geared to dialogue (Purver et al. 2006, Cann et al. 2005, Kempson et al. 2001). Consider the clause Jack saw Jill, or even the synonymous Jill, Jack saw - DS models the parsing of such utterances as monotonic growth of information. Informally, for DS, parsing (typically) starts by "guessing", quite reasonably, that the goal of speaking is to utter a complete clause; parsing then proceeds by adding more and more information as lexical "stuff" is incrementally encountered, extending the represented information accordingly. For example, the first word may be Jack, followed by saw, and finally Jill; alternatively, the first word may be Jill, followed by Jack, and lastly saw. Parsing proceeds step-by-step until a fully specified output structure is determined for the complete utterance (thereby confirming the original "guess"). Interestingly, this approach does not mean DS is tied to modelling only complete clause utterances; quite the contrary. Dialogue is replete with all manner of utterances, so that in answer to the question Who did Jack see?, there is a range of valid replies, from one-word answers like Jill all the way up to full clauses like Jack saw Jill. DS is unique amongst grammar formalisms in being able to straightforwardly model the full range of utterances that characterise actual dialogue - for DS, all such answers provide essentially the same information, in context. Taking all of this into account leads directly to DS being one of the few generation systems that is (1) goal-directed, (2) fully incremental, and (3) context-dependent. We aim to combine the strengths of both DS and ECG, adapting the parsing/generation aspects of DS to work with the linguistic and conceptual representations used by ECG.
In this way, the resulting system will perform ECG-style mapping between constructions and schemas, in an incremental and context-dependent way.
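The "monotonic growth" idea can be illustrated with a deliberately simplified sketch. The real DS formalism works by typed tree growth (Kempson et al. 2001); the toy lexicon and slot-filling below are invented for exposition only, but they show the key property: each word only ever adds information, never retracts it, and Jack saw Jill and Jill, Jack saw grow the same final structure along different incremental routes.

```python
# Toy sketch (not DS code): parsing as monotonic growth of a partial
# semantic structure. The lexicon below is invented and ignores word
# order, which is enough to illustrate the monotonicity point.

LEXICON = {
    "Jack": "agent",
    "Jill": "patient",
    "saw": "predicate",
}

def parse_incrementally(words):
    """Yield the growing structure after each word; information is only
    ever added (monotonic growth), never removed or revised."""
    # Initial "guess": the goal is a complete clause, all slots open.
    structure = {"predicate": None, "agent": None, "patient": None}
    for word in words:
        structure[LEXICON[word]] = word   # monotonic: only ever add
        yield dict(structure)             # snapshot of the partial parse

# Two word orders, one final fully specified structure.
final_a = list(parse_incrementally(["Jack", "saw", "Jill"]))[-1]
final_b = list(parse_incrementally(["Jill", "Jack", "saw"]))[-1]
```

After the first word of Jack saw Jill, only the agent slot is filled; each subsequent word strictly extends the structure until the initial clause "guess" is confirmed, mirroring the informal description above.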

Stages of the project:

The project will progress through a series of stages, which have been designed in such a way that the completion of a stage represents a milestone for the project as a whole. There are six such "milestones", listed below. Broadly, the resulting architecture will consist of a pipeline of modules, from ATT-Meta to ECG to DS, in turn. The resulting output will be the incremental generation of metaphors, expressed naturally and appropriately in context.

  1. To kick-start the pipeline from conceptual to linguistic levels, a store of ATT-Meta logical forms will be developed from example ECG schemas.

  2. Next, the ATT-Meta-to-ECG encoding is employed to pipe conceptual content from ATT-Meta to the conceptual processing mechanism in ECG.

  3. Within ECG, conceptual schemas are linked to linguistic constructions; the latter are then passed to the DS module.

  4. Outside the central processing modules, appropriate patterns of metaphorical expressions are mined from corpora of metaphorical expressions.

  5. The combined outputs of stages (3) and (4) are input to the DS module, specifically its generation component; the resulting output is conventional metaphorical expressions.

  6. Finally, the output of stage (5) is fed back into ATT-Meta, to take advantage of its unique capabilities for extending metaphorical meanings, in order to go beyond conventional metaphorical expressions.
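The six stages above amount to a linear pipeline from conceptual to linguistic processing. The following Python sketch shows only the shape of that pipeline: every function name, data field, and placeholder value is invented here (the real modules are the systems described earlier), so this is a wiring diagram in code, not an implementation.

```python
# Toy pipeline sketch (all names and values invented): ATT-Meta -> ECG -> DS,
# with corpus-mined patterns feeding into DS generation.

def att_meta_reason(target_content):
    """Stages 1-2: produce conceptual content, possibly via reverse mappings."""
    return {"schema": "Transfer", "metaphorical": True, "content": target_content}

def ecg_link(conceptual):
    """Stage 3: link conceptual schemas to linguistic constructions."""
    return {"construction": "Ditransitive", **conceptual}

def corpus_patterns(linked):
    """Stage 4: attach corpus-mined patterns of metaphorical expression."""
    return {"patterns": ["give <someone> an idea"], **linked}

def ds_generate(enriched):
    """Stage 5: DS generation outputs a conventional metaphorical wording."""
    return enriched["patterns"][0].replace("<someone>", "her")

def pipeline(target_content):
    """Stages 1-5 in sequence; stage 6 would feed this output back into
    ATT-Meta to extend beyond conventional metaphorical expressions."""
    return ds_generate(corpus_patterns(ecg_link(att_meta_reason(target_content))))
```

Running the pipeline on some conceptual content yields a conventional metaphorical expression, which stage (6) would then hand back to ATT-Meta for open-ended extension.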


References:
  1. Barnden, J. (2008). "Metaphor and artificial intelligence: Why they matter to each other." In R.W. Gibbs Jr. (Ed.), The Cambridge handbook of metaphor and thought. Cambridge: Cambridge University Press. Pp. 311-338.

  2. Barnden, J. (2008). "Metaphor and context: A perspective from artificial intelligence." In A. Musolff & J. Zinken (Eds.), Metaphor and Discourse. Basingstoke: Palgrave Macmillan. Pp. 79-94.

  3. Bryant, J.E. (2008). Best-Fit Constructional Analysis. Ph.D. thesis, University of California, Berkeley.

  4. Cann, R., R. Kempson & L. Marten (2005). The Dynamics of Language. Oxford: Elsevier.

  5. Croft, W. & D. Cruse (2004). Cognitive Linguistics. Cambridge: Cambridge University Press.

  6. Feldman, J. (2010). "Embodied Language, Best-Fit Analysis, and Formal Compositionality." Physics of Life Reviews 7(4):385-410.

  7. Kempson, R., W. Meyer-Viol & D. Gabbay (2001). Dynamic Syntax. Oxford: Blackwell.

  8. Purver, M., R. Cann & R. Kempson (2006). "Grammars as Parsers: Meeting the Dialogue Challenge." Research on Language and Computation 4(2-3): 289-326.