E-Drama Resources

1. Designing autonomous human-like conversational avatars

2. Emotion modeling and detection

3. Interactive drama system

4. Books

5. Conversational acts and rules

6. Other papers

7. Useful links


1. Designing autonomous human-like conversational avatars

 

  1. BodyChat: Autonomous Communicative Behaviors in Avatars. Vilhjalmsson, H. H. and Cassell, J. Proceedings of the 2nd Annual ACM International Conference on Autonomous Agents. 1998. (Note: This paper designs autonomous conversational avatars using context analysis and discourse theory to implement avatars’ distant and close salutations, gestures, etc. The system allows users to communicate via text while the avatars automatically animate attention, salutations, turn taking, back-channel feedback, facial expression, etc. This implementation can be quite useful for our eDrama project.)

 

  2. The Power of a Nod and a Glance: Envelope vs. Emotional Feedback in Animated Conversational Agents. Cassell, J. and Thorisson, K. R. Applied Artificial Intelligence. Vol. 13. Pages 519-538. 1999. (Note: This paper demonstrates the importance of animated agents’ envelope feedback, such as salutation, gesture, gaze and lifted eyebrows. Three feedback conditions (content-related only, content + envelope feedback, and content + emotional feedback) were tested. The results confirm that envelope feedback plays a significant role in supporting the process of dialog; in their implementation it proved even more crucial than emotional feedback, a point worth considering for the implementation of our eDrama system.)

 

  3. Expressive Characters and a Text Chat Interface. Gillies, M., Crabtree, B. and Ballin, D. Proceedings of AISB 2004 Symposium: Motion, Emotion and Cognition. 29 March – 1 April 2004, Leeds, UK. (Note: Gillies et al. provide life-like autonomous avatars with non-verbal communicative behavior under some high-level control from the user. The Demeanour framework, with a text chat interface, has been developed to generate non-verbal behavior for avatars, focusing on three behaviors: posture, gesture and eye gaze. An important aspect of the Demeanour user interface is that commands, which result in actions, change the state of the avatars in the long run by updating the avatars’ profiles. The non-verbal behavior contributes to the flow of conversation as well.)



2. Emotion modeling and detection

 

Papers:

 

  1. Emotion Dialogue Simulator. Wiltschko, W. R. eDrama Learning, Inc. eDrama Front Desk. (Note: This paper introduces the theoretical background and a high-level description of the eDrama Front Desk system, including general descriptions of the natural language parser, dialogue management and emotion modeling. The emotion model used in this system is derived from the OCC model and based on two factors: respect and power. Though it lacks important implementation details, some of the ideas are promising and very useful for the development of our system. (emotion reasoning and emotion generation))

 

  2. Speech Emotion Recognition Using Hidden Markov Models. Nogueiras, A., Moreno, A., Bonafonte, A. and Marino, J. B. Proceedings of the European Conference on Speech Communication and Technology (Eurospeech 2001). (Note: This paper presents a first approach to emotion detection using speech features: energy features and pitch features. The approach is based on standard speech recognition technology using hidden Markov models (HMMs). The accuracy exceeds 80% with the best combination of speech features and HMM configuration, which is comparable to that of human judges.)
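The full HMM approach in the paper is beyond a short sketch, but the underlying idea of scoring utterance-level energy and pitch features against per-emotion models can be illustrated with a much simpler stand-in: one Gaussian per feature per emotion. All emotion categories, feature choices and numbers below are invented for illustration and are not taken from the paper.

```python
import math

# Hypothetical per-emotion statistics (mean, std) for two features:
# (mean pitch in Hz, mean normalized energy). Illustrative numbers only.
EMOTION_MODELS = {
    "anger":   [(220.0, 30.0), (0.80, 0.10)],
    "sadness": [(150.0, 25.0), (0.30, 0.10)],
    "joy":     [(240.0, 35.0), (0.70, 0.15)],
}

def log_likelihood(features, model):
    """Sum of per-feature Gaussian log-densities."""
    total = 0.0
    for x, (mu, sigma) in zip(features, model):
        total += -math.log(sigma * math.sqrt(2 * math.pi)) \
                 - ((x - mu) ** 2) / (2 * sigma ** 2)
    return total

def classify(features):
    """Pick the emotion whose model best explains the features."""
    return max(EMOTION_MODELS,
               key=lambda e: log_likelihood(features, EMOTION_MODELS[e]))

print(classify((225.0, 0.78)))  # high pitch, high energy -> anger
```

A real HMM-based recognizer would model the temporal evolution of these features over frames rather than a single utterance-level summary, which is what gives the paper its higher accuracy.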

 

  3. Multiple Level Representation of Emotion in Computational Agents. Davis, D. N. Proceedings of AISB 2001: Agents and Cognition. Emotion, Cognition and Affective Computing. University of York, March 2001. (Note: This paper reports an investigation into the computational modeling of emotion and motivation. The work mainly focuses on the use of cellular automata to model an emotion engine. Experiments are based on simple multi-agent scenarios. In general, it is not very useful.)

 

  4. Modeling Character Emotion in an Interactive Virtual Environment. Mehdi, E. J., Nico, P., Julie, D. and Bernard, P. Proceedings of AISB 2004 Symposium: Motion, Emotion and Cognition. 29 March – 1 April 2004, Leeds, UK. (Note: Mehdi et al. report a virtual reality training tool for firemen with emotion modeling. They consider personality, mood and emotion for the generation of emotional behavior, and use four basic emotion categories proposed by the OCC model: satisfaction, disappointment, anger and fear. The widely accepted Five Factor Model is used for personality modeling. Discussion and equations are provided for the final generation of emotion. (emotion reasoning and emotion generation))

 

  5. Machinery for Artificial Emotions. Botelho, L. M. and Coelho, H. 2001. Journal of Cybernetics and Systems 32(5):465-506. (Note: Botelho and Coelho discuss emotion from the perspective of the Salt & Pepper project. The paper makes four main contributions to the study and use of emotions in artificial intelligence. First, it discusses the main difference between affective appraisal and cognitive appraisal. Second, it introduces an original proposal for the classification of emotions. Third, it presents a concrete implementation and use of emotion within the Salt & Pepper architecture. Finally, it reports several examples from the Salt & Pepper project. (emotion reasoning and emotion generation))

 

  6. A Model for Personality and Emotion Simulation. Egges, A., Kshirsagar, S. and Magnenat-Thalmann, N. 2003. Knowledge-Based Intelligent Information & Engineering Systems. (Note: The authors describe a prototype based on the OCC model and a personality model, combined with a dialogue environment and a 3D talking head with lip-synchronized speech and facial expression. First, OCC is used to reason about emotional states, which are then filtered by the personality model. The personality model is based on five dimensions (openness, conscientiousness, extraversion, agreeableness, neuroticism). Finally, an intelligent agent creates behavior according to the generated emotional state, a one-dimensional mood, and the personality. Only a small interactive system has been developed so far; future work will focus on validating the current emotion model and extending it with multiple mood dimensions. (emotion reasoning and emotion generation))
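The appraise-then-filter pipeline that recurs in several of these OCC-based papers can be sketched as follows. This is a toy illustration, not any of the authors' actual models: the event categories, trait weights and numbers are all invented.

```python
# Minimal sketch of OCC-style appraisal followed by a personality filter.
# Appraisal maps an event to a candidate emotion; personality traits then
# scale its intensity before it drives behavior.

def appraise(desirability, caused_by_other):
    """OCC-style appraisal: (event desirability, responsible party) -> emotion."""
    if desirability > 0:
        return "gratitude" if caused_by_other else "joy"
    return "anger" if caused_by_other else "distress"

def filter_intensity(emotion, base_intensity, traits):
    """Scale emotion intensity by Five Factor traits (all in [0, 1])."""
    intensity = base_intensity
    if emotion in ("anger", "distress"):
        intensity *= 0.5 + traits["neuroticism"]   # neurotic agents feel negatives more
    if emotion in ("joy", "gratitude"):
        intensity *= 0.5 + traits["extraversion"]  # extraverts feel positives more
    return min(intensity, 1.0)

traits = {"openness": 0.5, "conscientiousness": 0.5, "extraversion": 0.2,
          "agreeableness": 0.5, "neuroticism": 0.9}

emotion = appraise(desirability=-0.8, caused_by_other=True)
print(emotion, filter_intensity(emotion, 0.6, traits))  # anger, amplified
```

The same two-stage shape (reason about the emotion, then filter it through personality or social role) also appears in the Prendinger and Ishizuka paper below.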

 

  7. Integrating Affective Computing into Animated Tutoring Agents. Elliott, C., Rickel, J. and Lester, J. 1997. IJCAI’97 Workshop on Intelligent Interface Agents. (Note: The authors demonstrate two tutoring systems integrated with the Affective Reasoner: the Soar Training Expert for Virtual Environments (Steve) and the Design-a-Plant system (Herman the Bug). Both integrated systems provide virtual tutors with personality, emotional responsiveness and affective user modeling. For convenience of exposition, Steve is used as the platform for discussing personality and emotional responsiveness in tutors, while Herman is used for discussing affective user modeling; in fact, both kinds of emotion reasoning are included in both systems. (emotion reasoning and emotion generation))

 

  8. A Domain-Independent Framework for Modeling Emotion. Gratch, J. and Marsella, S. 2004. Journal of Cognitive Systems Research (in press). (Note: The authors produce a computational framework integrating appraisal and coping to reason about emotions and provide emotional responses for virtual agents. Their main contribution is the first computational description of emotional coping. The close relationships between cognition, appraisal and coping responses are discussed as well. EMA (Emotion and Adaptation) is an implementation of the proposed emotion-process framework. The appraisal function of EMA is built on the Emile system, which blends several psychological theories but is mainly based on the OCC model and the work of Lazarus. Though the proposed theory and implemented models are effective in a number of ways, Gratch and Marsella emphasize that their framework suits deliberative reasoning rather than perception and non-deliberative reasoning. Moreover, the direct influence link between appraisal and coping is controversial, since it has not been established from a psychological point of view. (emotion reasoning and emotion generation))

 

  9. Simulating Affective Communication with Animated Agents. Prendinger, H. and Ishizuka, M. 2001. Proceedings of the Eighth IFIP TC.13 Conference on Human-Computer Interaction (INTERACT 2001), Tokyo, Japan, pages 182-189. (Note: The authors present a model of interaction between users and animated agents as well as between agents. As essential requirements for animated agents’ ability to perform affective communication, they reason about emotions and emotion expression based on the OCC model, personality, and social role awareness. The main contribution of their system is the use of filter programs, based on personality and social role awareness, to obtain the final emotion expression. This makes the computational model of affective communication capable of producing natural responses and social behavior. (emotion reasoning and emotion generation))

 

  10. Emotion Model for Life-like Agent and Its Evaluation. Ushida, H., Hirayama, Y. and Nakajima, H. 1998. Proceedings of the Fifteenth National Conference on Artificial Intelligence, 62-69. Menlo Park, Calif.: AAAI Press. (Note: The authors propose an emotion model for life-like agents with emotions and motivations. The model includes both reactive and deliberative mechanisms: the former provides low-level responses to external stimuli from both real and virtual worlds, while the latter mainly processes emotions. The emotion model is also derived from cognitive appraisal theory, with cognitive and emotional processes interacting with each other. Additionally, a learning mechanism provides diversified behaviors. An interactive virtual world with virtual characters has been used to test the model, and three groups of results are discussed and evaluated. (emotion reasoning and emotion generation))

 

  11. Tears and Fears: Modeling Emotions and Emotional Behaviors in Synthetic Agents. Gratch, J. and Marsella, S. Proceedings of ACM Agents'01, pp. 278-285, Montreal, Canada, May 28 - June 1, 2001.

 

  12. A Step Toward Irrationality: Using Emotion to Change Belief. Marsella, S. and Gratch, J. Proceedings of the First International Joint Conference on Autonomous Agents and Multi-Agent Systems, pp. 334-341, Bologna, Italy, July 15-19, 2002. (Note: This paper proposes a single unified emotion-modeling system combining several complementary approaches, namely appraisal-driven and communication-driven: Gratch's Emile system, based on emotional appraisal, and Marsella's work on the impact of emotions on behavior, such as the physical expression of emotional state through a suitable choice of gestures and body language.)

 

  13. Behaviors That Emerge from Emotion and Cognition: Implementation and Evaluation of a Symbolic-Connectionist Architecture. Henninger, A. E., Jones, R. M. and Chown, E. Proceedings of the Second International Joint Conference on Autonomous Agents and Multi-Agent Systems, pp. 321-328, Melbourne, Australia, July 14-18, 2003. (Note: This paper targets the design of a framework for modelling emotions in agents' decision-making. It proposes a connectionist-style model to integrate the factors affecting emotions.)

 

  14. Emotion Recognition and Its Application to Computer Agents with Spontaneous Interactive Capabilities. Nakatsu, R., Nicholson, J. and Tosa, N. Proceedings of Creativity & Cognition 1999, pp. 135-143, Loughborough, UK. (Note: This paper proposes the use of neural networks to recognize emotion in speech, trained on a large database of emotional speech, with one network per emotion. It also describes the use of such a system in real-time human-agent interaction (just for reference; I doubt this can be useful).)

 

  15. More Realistic Human Behavior Models for Agents in Virtual Worlds: Emotion, Stress, and Value Ontologies. Silverman, B. Technical report, University of Pennsylvania, http://hms.upenn.edu/publications.html, last accessed 04/2004. (Note: This paper gives a detailed description of a system architecture for a multi-agent environment, with each agent having emotion, personality, etc. It contains many useful references on emotion-based agents, interaction methods, and so on. However, eDrama differs from a traditional intelligent-agent simulation environment in that it has a director and a story must be followed, so the agent architecture, interaction model, and scene management model would be different.)

 

  16. Dynamically Altering Agent Behaviors Using Natural Language Instructions. Bindiganavale, R., Schuler, W., Allbeck, J., Badler, N., Joshi, A. and Palmer, M. Autonomous Agents, pp. 293-300, Barcelona, Catalonia, Spain. June 3 - June 7, 2000. (Note: This is a typical research outcome from the simulation center at the University of Pennsylvania. It uses natural language instructions to change the behaviors of emotion- and personality-based agents, and could be a useful reference for our eDrama bit-part character model.)



3. Interactive drama system

 

  1. Interactive Pedagogical Drama (Carmen's Bright IDEAS). Marsella, S. C., Johnson, W. L. and LaBore, C. Proceedings of Agents 2000, Barcelona, Spain, 2000. Carmen's Bright IDEAS provides useful references on how several components, including emotion-based agents, story and dialogue generation, can be combined to produce a virtual drama system, and its use of a cinematography agent is an effective approach to enhancing the dramatic experience. User interaction is limited to selecting pre-defined answers (via pop-up dialogue bubbles) rather than typed-in user utterances. Animation is limited to a 2D cartoon style.

 

  2. Bringing Drama into a Virtual Stage (Teatrix). Machado, I., Prada, R. and Paiva, A. Proceedings of the Third International Conference on Collaborative Virtual Environments (San Francisco, California, USA), pp. 111-117. Teatrix developed a practical distributed environment for a multi-user virtual drama system, along with an intelligent agent architecture based on goals and emotions. Although a simple 3D environment is used in the system, each character is represented by a 2D image. Directorial intervention is used to give "hints" to users in order to progress the drama story; however, the detailed working mechanism is not clear.

 

  3. Interactive Drama on Computer: Beyond Linear Narrative. Szilas, N. In Proceedings of the AAAI Fall Symposium on Narrative Intelligence, Massachusetts, USA, November 5-7, 1999. This paper argues that a so-called narrative logic should replace the logic of character. Narrative logic differs from the character's logic in that it is concerned only with significant actions from a narrative perspective. However, only a very vague, high-level system and agent architecture is described.

 

  4. Towards an Architecture for Intelligent Control of Narrative in Interactive Virtual Worlds (Mimesis). Young, R. M. and Riedl, M. (2003). In Proceedings of the International Conference on Intelligent User Interfaces, January 2003. Mimesis provides very useful references on a server-client distributed drama structure with a highly modularized system and an effective way to handle user activity. However, the proposed system is entirely based on the Unreal game engine, whose animation engine may not have enough flexibility to deliver complex emotion-based character behaviours, e.g. the automated characters in e-drama.

 

  5. Building an Interactive Drama Architecture. Magerko, B. and Laird, J. E. 1st International Conference on Technologies for Interactive Digital Storytelling and Entertainment, Darmstadt, Germany, March 24-26, 2003. This paper uses a state-based story representation for virtual drama. Users are allowed to perform flexible activities that may conflict with the desired story progress; an automated Director agent is responsible for monitoring and categorizing users' behaviours.
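A state-based story representation with a director that categorizes user actions can be sketched as follows. The beats, action names and three-way categorization are invented for illustration; Magerko and Laird's actual architecture is far richer.

```python
# Toy state-based story: each beat lists the actions that advance it and
# the actions that disrupt it. A director agent categorizes each user
# action as on-story, harmless, or disruptive, intervening only in the
# last case (e.g. by steering the user back with a hint).
STORY = [
    {"beat": "greeting",      "advance": {"say_hello"}, "disruptive": {"leave_room"}},
    {"beat": "confrontation", "advance": {"accuse"},    "disruptive": {"leave_room", "attack"}},
    {"beat": "resolution",    "advance": {"apologize"}, "disruptive": set()},
]

class Director:
    def __init__(self, story):
        self.story = story
        self.index = 0  # current beat

    def handle(self, action):
        beat = self.story[self.index]
        if action in beat["advance"]:
            self.index += 1
            return "advance"
        if action in beat["disruptive"]:
            return "intervene"
        return "ignore"  # harmless off-story behaviour

d = Director(STORY)
print(d.handle("say_hello"))   # advance to the confrontation beat
print(d.handle("dance"))       # ignore
print(d.handle("leave_room"))  # intervene
```

This matches the property the entry highlights: users retain freedom of action, and the director only reacts when that freedom threatens the desired story progress.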

  6. A Study into the Believability of Animated Characters in the Context of Bullying Intervention (VICTEC). Woods, S., Hall, L., Sobral, D., Dautenhahn, K. and Wolke, D. Proceedings of the 4th International Working Conference on Intelligent Virtual Agents - IVA 2003. (Note: This paper presents an interactive virtual learning environment with intelligent virtual agents, VICTEC (Virtual ICT with Empathic Characters), which applies synthetic characters and emergent narrative to help children aged 8-12 cope with bullying in a virtual school. The paper describes the VICTEC demonstrator and the trailer script, and discusses the experimental design and results. However, the physical aspects of the characters were regarded as unbelievable, unrealistic and even jerky by most users. Additionally, the characters’ emotional behavior is implemented with predefined animations and pre-recorded audio, and facial expressions are obtained through dynamically modified textures. These are the main differences from our eDrama project, which will detect emotion in the context of conversation.)

 

  7. Interactive Drama, Art and Artificial Intelligence (Façade). Mateas, M. 2002. Ph.D. Thesis, School of Computer Science, Carnegie Mellon University. Façade provides story-level interaction (drama management), believable agents, and shallow natural language processing in the context of a first-person, graphical, real-time interactive drama. The system provides very useful references on the overall interactive drama architecture, an effective way to unfold the drama story depending on previous user interactions and activity, and a practical natural language processing technique.

 

  8. Toward the Holodeck: Integrating Graphics, Sound, Character, and Story (MREP). Swartout, W., Hill, R., Gratch, J., Johnson, W. L., Kyriakakis, C., LaBore, C., Lindheim, R., Marsella, S., Miraglia, D., Moore, B., Morie, J., Rickel, J., Thiebaux, M., Tuch, L., Whitney, R. and Douglas, J. 2001. In Proceedings of the 5th International Conference on Autonomous Agents. The emotion model for the characters in MREP is highly relevant to the design of the autonomous agents in e-drama. In addition, the story-net method seems to be a practical way to manage the unfolding of the drama.

 

  9. Character-based Interactive Storytelling. Cavazza, M., Charles, F. and Mead, S. J. (2002). IEEE Intelligent Systems, special issue on AI in Interactive Entertainment, pp. 17-24. This system concentrates on generating autonomous character behaviours in response to events and actions generated by users. A hierarchical task network is used to carry out action planning (behaviour selection) for the characters.
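The hierarchical-task-network planning used for behaviour selection can be sketched minimally as follows. The task names and methods are invented; real HTN planners, including the one in Cavazza et al.'s system, also handle state updates, interleaving and backtracking.

```python
# Toy HTN: a compound task decomposes into subtasks via the first method
# whose precondition holds in the current state; primitive tasks are
# returned directly as executable actions.
METHODS = {
    "get_drink": [
        (lambda s: s["has_glass"], ["fill_glass", "drink"]),
        (lambda s: True,           ["fetch_glass", "fill_glass", "drink"]),
    ],
}

def plan(task, state):
    """Depth-first decomposition of a task into a list of primitive actions."""
    if task not in METHODS:  # primitive task: emit as-is
        return [task]
    for precondition, subtasks in METHODS[task]:
        if precondition(state):
            actions = []
            for sub in subtasks:
                actions += plan(sub, state)
            return actions
    return []  # no applicable method

print(plan("get_drink", {"has_glass": False}))
# ['fetch_glass', 'fill_glass', 'drink']
```

The appeal for interactive storytelling is that the character's state (here, whether it already holds a glass) selects a different decomposition, so user actions that change the world automatically change the character's plan.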

 



4. Books

 

  1. The Cognitive Structure of Emotions. Ortony, A., Clore, G. L. and Collins, A. Cambridge University Press, 1988. (Note: This book presents a systematic and detailed account of the cognitive generation of emotions. The authors propose three aspects of the world to which we can react emotionally: events of concern to us, the actions of those we consider responsible for such events, and objects. These aspects lead to three classes of emotions. The authors characterize a wide range of emotions and offer concrete proposals about the factors that influence the intensity of each. This book will interest a wide audience in cognitive, clinical, and social psychology, as well as in artificial intelligence and other cognitive science disciplines.)

 



5. Conversational acts and rules

 

  1. http://www.msu.edu/user/mckeonm/PHL130/Chapter1.html

 

  2. Conversational Maxims and Principles of Language Planning. Traunmüller, H. PERILUS XII, pages 25-47. Department of Linguistics, Stockholm University. 1991. (Note: These two resources contain principles, rules and acts of conversation, explained with examples from different points of view. They can be very useful for producing dialogue for the automated director's interventions and the bit-part character.)

 



6. Other papers

 

  1. Beyond Shallow Models of Emotions. Sloman, A. Cognitive Processing. 2(1): 177-198. 2001. (Note: This paper suggests an approach to deeper explanatory theories of emotions and many other kinds of mental phenomena. The proposed architecture-based theory allows the author to distinguish (at least) primary, secondary and tertiary emotions, and to produce a coherent theory that explains a wide range of phenomena. The paper also argues that the significant difficulties current intelligent systems suffer from are due to the use of shallow emotion modeling. The theories suggested in the paper also link together much previous work by other researchers.)

 

  2. Knowledge Based Conversational Agents and Virtual Storytelling. Tarau, P. and Figa, E. Proceedings of the 2004 ACM Symposium on Applied Computing. Pages 39-44.

 

  3. Achieving Affective Impact: Visual Emotive Communication in Lifelike Pedagogical Agents. Lester, J. C., Towns, S. G. and Fitzgerald, P. J. International Journal of Artificial Intelligence in Education, 10, pages 278-291. 1999.

 



7. Useful links

 

  1. VICTEC (Virtual ICT with Empathic Characters)

 

  2. AISB 2004 Convention: Motion, Emotion and Cognition

http://www.leeds.ac.uk/aisb/

 
