Boston Dynamics Inc. (BDI) creates automated computer characters and engineering simulations for things that move, such as humans, animals, robots and electromechanical devices. They specialize in dynamic simulation coupled to 3D computer graphics. They are converging on interactive characters from the graphics and simulation vector rather than the AI and personality vector.
Creatures are Alife pets that you raise from eggs. Their technology is distinctive in its level of biological modeling. A Creature has a neural net for action selection, artificial biochemistry (including hormonal effects on the neural net), an immune system, and a reproductive system (a genome encodes for creature traits).
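The hormonal modulation of action selection can be sketched in a few lines. This is my own toy illustration, not Cyberlife's actual architecture: the action names, stimulus weights, and single "fear hormone" are all assumptions.

```python
def select_action(stimuli, weights, fear_hormone):
    """Toy action selection: each action's activation is a weighted sum
    of stimulus values; a hormone level biases the "flee" action, loosely
    echoing how a Creature's biochemistry modulates its neural net."""
    scores = {}
    for action, w in weights.items():
        scores[action] = sum(w.get(s, 0.0) * v for s, v in stimuli.items())
    # Hormonal modulation: fear raises the activation of fleeing.
    scores["flee"] = scores.get("flee", 0.0) + fear_hormone
    return max(scores, key=scores.get)
```

With a high fear-hormone level the creature flees even when food is present; with a low level, it eats.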
Extempo Systems was founded by Barbara Hayes-Roth, leader of the Virtual Theater Project at Stanford. Extempo is creating architectures and authoring tools for the creation of improvisational characters. Their first demo is Erin the bartender, a character who serves drinks and chats with customers in a virtual bar.
Fin Fin is a half-bird half-dolphin creature who lives in a world called Teo. Users interact with Fin Fin via a microphone, proximity sensor and mouse. Fin Fin is shy; a user has to slowly build up a relationship with Fin Fin over time. Fin Fin utilizes technology developed by the Oz group at Carnegie Mellon.
The COLLAGEN project has built a toolkit to support the construction of agents that collaborate with humans in accomplishing tasks. This toolkit embodies principles from collaborative discourse theory.
A middleware toolkit for building multi-user social virtual worlds. SPLINE was used to build the experimental virtual world Diamond Park.
A multi-user virtual world inhabited by an autonomous agent named Mike.
This project represents MERL's research into interactive narrative (drama). The focus is on stories in which a main character learns and changes during the events in the story. The model of user interaction is that of the Greek Chorus (Larry Friedlander). The members of the chorus provide multiple perspectives on the unfolding events. The first story implemented, about the Montgomery Bus Boycott, is called Tired of Giving In.
Motion Factory is developing "Intelligent Digital Actor technology." Digital actors generate their own animation (motion) based on interactions with the environment. Motion Factory is an example of work converging on believable characters from the graphics community rather than the artificial intelligence community.
Super Wan-Chan is a virtual world containing four puppies that you raise to adulthood.
OZ Interactive makes 3D avatar worlds. They are building autonomous characters to inhabit their worlds. Early characters include Theresa, a moody expert on Greek mythology, and Angel, a helpful butler. From what I can tell, their agents are currently chatterbots.
The Persona project at Microsoft Research is developing the technologies required to produce conversational assistants: lifelike animated characters that interact with a user in natural spoken dialog. Their first prototype is Peedy, a character that responds to requests to play music. Gene Ball, a researcher in the Persona project, organizes the conference Lifelike Computer Characters.
Dogz, Catz and Oddballz are autonomous pets that live on your screen.
Tamagotchi is a small, egg-shaped plastic toy with an LCD screen and 3 buttons. Users must nurture a creature that lives on the screen by feeding it, giving it medicine, disciplining it, and cleaning up excrement. If the user is negligent in these tasks, the creature dies. This product is a craze in Japan. While Tamagotchi possesses neither sophisticated personality nor sophisticated behaviors, it is an example of the powerful effect that even a small amount of lifelike behavior has on users.
Zoesis was recently founded by Joseph Bates (head of the Oz project) and Oz project alumni. Its goal is to build interactive story experiences utilizing believable agents.
February 1997. Marina del Rey, CA.
Elisabeth André, James Lester, Thomas Rist, Aki Takeuchi, Co-chairs. IJCAI 1997. Nagoya, Japan.
Hiroaki Kitano, Joseph Bates, Co-chairs. AAAI 1994. Seattle, WA.
Joseph Bates, Barbara Hayes-Roth, Brenda Laurel, Nils Nilsson, Co-chairs. Spring Symposia 1994. Stanford University.
Hiroaki Kitano, Chair. AAAI 1996. Portland, OR.
Hiroaki Kitano, Joseph Bates, Barbara Hayes-Roth, Co-chairs. IJCAI 1995. Montreal, Quebec.
Joseph Bates, Barbara Hayes-Roth, Patti Maes, Co-chairs. Spring Symposia 1995. Stanford University.
Gene Ball, Organizer. Last held October, 1996. Snowbird, UT.
Kerstin Dautenhahn, Chair. Fall Symposia 1997. MIT, Cambridge, MA.
Joseph Bates, Chair. AAAI 1990. Boston, MA.
A collection of links maintained by Craig Reynolds, famous for his creation of Boids, an artificial life simulation of flocking birds in which each bird's behavior is determined only by local rules. The collection is organized into the following categories: Characters, Creativity, Physically-Based Animated Figures, Games, Participatory Ecosystems, Robots, Agents, and Multi-Agent Systems.
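Boids is worth sketching because the whole idea fits in a screenful: each bird adjusts its velocity using only three local rules (cohesion, alignment, separation) computed over nearby flockmates. The rule weights below are arbitrary assumptions for illustration, not Reynolds' original constants.

```python
def boid_step(boid, neighbors, dt=0.1):
    """Minimal 2D Boids sketch. A boid is ((px, py), (vx, vy)); neighbors
    is a list of nearby boids in the same form. Behavior emerges from
    purely local rules: cohesion, alignment, and separation."""
    (px, py), (vx, vy) = boid
    if neighbors:
        n = len(neighbors)
        cx = sum(p[0] for p, _ in neighbors) / n - px  # cohesion: steer toward local center
        cy = sum(p[1] for p, _ in neighbors) / n - py
        ax = sum(v[0] for _, v in neighbors) / n - vx  # alignment: match neighbors' velocity
        ay = sum(v[1] for _, v in neighbors) / n - vy
        sx = sum(px - p[0] for p, _ in neighbors)      # separation: avoid crowding
        sy = sum(py - p[1] for p, _ in neighbors)
        vx += 0.01 * cx + 0.1 * ax + 0.05 * sx
        vy += 0.01 * cy + 0.1 * ay + 0.05 * sy
    return ((px + vx * dt, py + vy * dt), (vx, vy))
```

Stepping every boid this way each frame produces flock-like motion with no global controller, which is the point of the demonstration.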
A collection of references to AI models of emotion.
A list of researchers primarily engaged in realistic human modeling (as opposed to modeling that employs artistic abstraction). The list is part of a project entitled Computer Aided Theory of Consciousness: An Essay in Experimental Digital Philosophy (the page is written in German).
A group that promotes avatar spaces.
The Loebner Prize contest, held each year, awards $2000.00 to the author of the program that does the best job of passing a limited form of the Turing test.
A group that explores the future of immersive, dramatic storytelling.
A page discussing research and commercial products related to virtual pets.
Led by Clark Elliott. The goal of this project is to build agents that can reason about emotion. Currently they have systems that can detect emotion in human voice, express emotion through facial expressions and speech inflection, and "have" emotions (in the sense that emotions detected in the user trigger emotions in the agent).
Home of Jack, a graphical human simulation package. The research at the Center is focused around building behavior and physics-based simulations of human figures.
Led by Rodney Brooks, the father of subsumption architecture. Rodney has been arguing for over a decade that the road to intelligence consists of building situated, embodied, broad agents (in his case, robots) which employ no semantic representations. Cog is a humanoid robot. As Cog interacts with the world using a body similar to a human body, it is hoped that Cog will learn to think the way humans do.
A project led by Aaron Sloman and Glyn Humphreys. The goal of this project is to explore the design space of AI architectures in order to understand which kinds of architectures are capable of which kinds of mental phenomena. They are interested in the whole range of human mental states; in particular they wish to discover whether emotions are an accident of evolution or fundamental to the design of any resource-limited intelligent agent.
Founded by Don Marinelli and Scott Stevens. They are charged with developing an entertainment technology program at CMU. Their current focus is Synthetic Interviews, an interactive video technology with which a user can have a conversation with some character.
Led by Justine Cassell. Using ideas from discourse theory and social cognition, this group designs agents which have discourse competence (e.g. knowing how to integrate gestures and speech to communicate, knowing how to take turns in a conversation, etc.).
This project is led by Ken Perlin and Athomas Goldberg. "The IMPROV Project at NYU's Media Research Lab is building the technologies to produce distributed 3D virtual environments in which human-directed avatars and computer-controlled agents interact with each other in real-time, through a combination of Procedural Animation and Behavioral Scripting techniques developed in-house." An example of convergence towards believable characters from the graphics side (vs. AI).
A project at the Media Lab led by Glorianna Davenport. They study techniques for bringing interactivity to the traditional cinematic medium (with notable exceptions such as Tinsley Galyean's Dogmatic, which is set in a virtual world). In general, this involves breaking down a linear medium (such as video) into a database of clips, somehow annotating those clips, and then intelligently choosing the right clips at the right time as a user interacts with the system. The video may be accompanied by other media such as email (Lee Morgenroth's Lurker).
Led by James Lester. This group focuses on intelligent multimedia. Currently they are focusing on animated pedagogical agents.
A robotics research lab, including remote-brained and humanoid robotics.
The home page for Julia, a chatterbot that lives in TinyMUDs.
Karl Wurst, in collaboration with the University of Connecticut's world-renowned Puppet Arts Program, is building robotic versions of the Woggles.
Led by Nadia Thalmann. This group works on virtual humanoids. Focus is on realistic modeling of human faces, movement, clothing, etc. Now starting to do work on autonomous systems.
Led by Paul Cohen. This group is building a baby that interacts in a simulated world. The goal is for the baby to learn the conceptual structure of the world through physical interaction.
Led by Joseph Bates, founder of Zoesis. The goal of the Oz project is to build interactive story worlds containing personality-rich, believable characters. A drama manager ensures that the user experiences a high-quality story.
An alumnus of the MIT AI Lab, Phil Agre developed Pengi, a system which played the video game Pengo. Pengi is an instance of "alternative AI": it employed reactive behaviors and deictic (context dependent) representations. He has written elegantly on why classical AI is inappropriate for building agents which engage in situated, embodied, routine activity. He now teaches Communications at UC San Diego.
Primarily a philosopher of AI, Selmer also does research in story generation. His forthcoming book, AI, Story Generation and Literary Creativity: The State of the Art will describe BRUTUS, his latest story generation system.
A project led by Clifford Nass and Byron Reeves. They are studying the way people apply social rules and schemas to their interactions with technology.
Led by Patti Maes. The software agent group explores the use of autonomous agents in a wide variety of contexts. Much of their work has an artificial life flavor (by which I mean that the work focuses on useful behavior emerging out of the interactions of many software agents). The use of agents as synthetic characters was explored by Bruce Blumberg in the ALIVE and Hamsterdam projects; he developed an ethologically motivated action-selection mechanism to drive his synthetic characters. The synthetic character work has now shifted to a new group being started by Bruce.
Led by W. Lewis Johnson. This group has built a pedagogic agent named Steve that trains humans in virtual worlds. Steve teaches people how to perform tasks, gives advice as it watches users perform tasks, and answers students' questions.
Led by Barbara Hayes-Roth, founder of Extempo. The metaphor informing their work is that of an improvisational actor. That is, they build actors who try to improvise behavior in different situations. An actor's improvisational choices may be influenced by an explicitly specified personality (a set of values along some dimensions of personality). They are also exploring how a human might exert high level control over one of these actors.
They are building a humanoid robot including sensing, recognition, expression and motion subsystems.
On-line articles available about the OZ project. Articles include overall descriptions of the goals of the project, the action architecture, the emotion architecture, and natural language generation (for the text based worlds).
On-line articles from the Software Agents Group. Articles relevant to believable agents are listed under "Modeling Synthetic Characters: Applications and Techniques." Articles include descriptions of ALIVE, action-selection architectures, and the role of artificial life in entertainment.
On-line articles available about the Virtual Theater Project. Articles include descriptions of their approach to emotion, personality, and user control of improvisational puppets.
Cognitive Science 17, 1993.
The articles in this issue discuss the relationship between "alternative AI" (sometimes called behavioral AI, or situated action) and "classical AI." Vera and Simon wrote an article in which they argue that none of the specific work that falls under the rubric of situated action can be construed as a refutation of the physical symbol system hypothesis; situated action is just a subset of symbolic AI that focuses on perception and motor control. The rest of the issue consists of articles written by various situated action proponents responding to Vera and Simon's article.
Phil Agre. A.I. Memo 1085. Artificial Intelligence Lab. MIT. October 1988.
Agre's Ph.D. thesis. Describes Pengi, a program that can play a video game called Pengo. Pengi is able to play the game without employing any traditional planning.
Phil Agre and David Chapman. A.I. Memo 1050a. Artificial Intelligence Lab. MIT. September 1988.
Argues for a view of plans as plans-for-communication (as opposed to the classic view of plans-as-programs).
Woody Bledsoe. AI Magazine, Spring 1986, pp 57-61.
Bledsoe describes the dream that brought him (and many AI researchers) into AI research in the first place: the dream of building computer companions.
Rodney Brooks. A.I. Memo 1293. Artificial Intelligence Lab. MIT. April 1991.
Argues for a situated, embodied, semantic-symbol-free approach to achieving intelligence in artificial systems.
Rodney Brooks. Robotics and Autonomous Systems 6, 1990. pp. 3-15.
Argues for a situated, embodied, semantic-symbol-free approach to achieving intelligence in artificial systems.
Paul R. Cohen, Marc S. Atkin, Tim Oates, and Carole R. Beal. Proceedings of the First International Conference on Autonomous Agents, pp. 170-177. Marina del Rey, CA. February 5-8, 1997.
Describes a simulated baby who learns concepts by "physically" interacting with a simulated world. This work comes out of the Neo project.
Antonio Damasio. Avon Books. 1994.
Describes recent research findings in neuropsychology which seem to indicate that emotion plays a fundamental role in human intelligence. Much of traditional cognitive psychology and artificial intelligence has assumed that emotion is not critical to understanding intelligence.
Lajos Egri. Simon and Schuster. 1946.
Describes how plays work via a theory which relates character, motive and story.
Clark Elliott. Proceedings of the First International Conference on Autonomous Agents, pp. 451-457. Marina del Rey, CA. February 5-8, 1997.
Describes an agent which communicates emotionally with people using speech recognition, text-to-speech conversion, real-time morphed schematic faces and music. This work comes out of the Affective Reasoning Project.
Masahiro Fujita and Koji Kageyama. Proceedings of the First International Conference on Autonomous Agents, pp. 435-442. Marina del Rey, CA. February 5-8, 1997.
Describes a standard defined by Sony Corporation for household entertainment robots.
Tinsley A. Galyean III. Ph.D. thesis, Media Arts and Sciences, MIT, June 1995.
Stephen Grand, Dave Cliff, and Anil Malhotra. Proceedings of the First International Conference on Autonomous Agents, pp. 22-29. Marina del Rey, CA. February 5-8, 1997.
Describes the architecture behind virtual pets which employ Alife technology (see Cyberlife).
Fumio Hara and Hiroshi Kobayashi. Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, Senri Life Science Center, Osaka, Japan, November 4-8, 1996, pp 1600-1607.
Describes a robot with a human-like face that can recognize and produce human facial expressions.
B. Hayes-Roth, R. van Gent, D. Huber. Proceedings of the AAAI Workshop on AI and Entertainment, 1996.
Describes a system that portrays a role change between a master and a servant. The master and servant improvise within the constraints of a script.
Masayuki Inaba, Ken'ichiro Nagasaka, Fumio Kanehiro, Satoshi Kagami, Hirochika Inoue. Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, Senri Life Science Center, Osaka, Japan, November 4-8, 1996, pp 15-22.
Describes a humanoid robot that can swing on a swing using visual tracking for control.
Chuck Jones. Farrar, Straus and Giroux. 1989.
The autobiography of Chuck Jones, an animator at Warner Bros. Describes the Warner Bros. approach to creating characters and story.
Margaret Kelso, Peter Weyhrauch, Joseph Bates. Presence: The Journal of Teleoperators and Virtual Environments, Vol. 2, Num. 1, MIT Press, Winter 1993.
Describes a series of live experiments to test the effect of interactive freedom on the dramatic experience. Also includes a description of plot graphs.
Brenda Laurel. Addison-Wesley, 1991.
Draws on Aristotle's theory of drama to define a new approach to designing dramatic human-computer interfaces.
Brenda Laurel. Ph.D. thesis, Drama department, Ohio State University, 1986.
Describes a hypothetical drama manager that guides an interactive story experience.
Michael Lebowitz. Poetics 14, 1985. pp. 483-502.
Describes the use of plan-like plot-fragments in UNIVERSE, a system that writes soap opera-like stories.
Michael Lebowitz. Poetics 13, 1984. pp. 171-194.
Describes the representations of characters in UNIVERSE, a system that writes soap opera-like stories.
James Lester and Brian Stone (IntelliMedia). Proceedings of the First International Conference on Autonomous Agents, pp. 16-21. Marina del Rey, CA. February 5-8, 1997.
Describes a competition-based behavior sequencing engine which produces life-like behavior while maintaining pedagogical appropriateness (e.g. don't distract a learner with some fancy behavior when they are problem solving).
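The competition idea can be sketched as behaviors bidding for control, with a pedagogical filter penalizing distracting winners. This is my own illustrative reduction; the behavior names and scoring scheme are assumptions, not Lester and Stone's actual engine.

```python
def next_behavior(candidates, learner_is_problem_solving):
    """Toy competition-based sequencing: each candidate behavior bids
    with an activation score; when the learner is problem solving,
    attention-grabbing ("flashy") behaviors are penalized so that
    pedagogically appropriate ones win the competition."""
    def score(b):
        s = b["activation"]
        if learner_is_problem_solving and b["flashy"]:
            s -= 1.0  # suppress distracting behavior during problem solving
        return s
    return max(candidates, key=score)["name"]
```

The same candidate set thus yields a showy behavior during idle moments but a quiet, helpful one while the learner is working.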
A. Bryan Loyall. Ph.D. thesis, Tech report CMU-CS-97-123, Carnegie Mellon University, May 1997.
Describes requirements for believability derived from the character arts. These requirements motivate the description of Hap, an agent language designed to facilitate writing believable agents. The thesis then describes several examples of agents written in Hap. Finally, a method for doing believable, embodied natural language generation in Hap is described. This work is part of the Oz Project.
A. Bryan Loyall and Joseph Bates. Proceedings of the First International Conference on Autonomous Agents, pp. 106-113. Marina del Rey, CA. February 5-8, 1997.
Describes the integration of embodied natural language generation into a behavioral agent architecture. "We describe our approach, and show how it leads to agents with properties we believe important for believability, such as: using language and action together to accomplish communication goals; using perception to help make linguistic choices; varying generated text according to emotional state; varying generated text to express the specific personality; and issuing the text in real-time with pauses, restarts and other breakdowns visible." This work is part of the Oz Project.
Scott McCloud. HarperCollins. 1993.
Written in comic book form, this book describes the semiotics of comics.
James Meehan. Ph.D. Dissertation. Yale University. 1976.
Describes a system that generates Aesop fable-like stories. It generates stories by using planning to achieve the goals of characters.
W. Scott Neal Reilly. Proceedings of the First International Conference on Autonomous Agents, pp. 114-121. Marina del Rey, CA. February 5-8, 1997.
Describes a methodology for building social behaviors on a character-by-character basis. The philosophy behind this approach is that generic taxonomies of social behavior and personality are inappropriate for building believable characters. This work comes out of the OZ Project.
W. Scott Neal Reilly. Ph.D. thesis. Tech report CMU-CS-96-138, Carnegie Mellon University, May 1996.
Describes a system that maintains emotional state and a methodology for incorporating emotion into the behaviors of believable agents. The thesis then describes a methodology for building believable social behaviors. This work is part of the Oz Project.
Ken Perlin, Athomas Goldberg. Proceedings of SIGGRAPH 96, pp. 205-216. New Orleans, LA. Aug. 1996.
Describes the interactive character architecture of the Improv project. An animation engine manipulates the control points of a graphical model. A behavior engine allows the user to specify higher-level scripts which control the character's motions. The scripts are written in an English-like scripting language (reminiscent of HyperTalk).
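The two-layer idea, behavior scripts driving a lower-level animation engine, can be sketched as a tiny script interpreter. This is an illustrative reduction, not Improv's actual scripting language; the step format, condition check, and action names are assumptions.

```python
def run_script(script, do_action, check):
    """Toy two-layer sketch: a behavior-engine script is a list of steps.
    A ("do", action) step triggers an animation-engine action directly;
    an ("if", (condition, action)) step triggers it only when the sensed
    condition holds. Returns the sequence of actions performed."""
    log = []
    for kind, arg in script:
        if kind == "do":
            do_action(arg)
            log.append(arg)
        elif kind == "if":
            condition, action = arg
            if check(condition):
                do_action(action)
                log.append(action)
    return log
```

Here `do_action` stands in for the animation engine and `check` for the character's perception, so the script layer never touches control points directly.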
Claudio Pinhanez. Proceedings of CHI 97, pp. 287-294. Atlanta, GA. March 22-27, 1997.
Describes a method whereby interaction can be scripted with a temporal calculus that represents the relationships between intervals. A constraint propagation mechanism is used to determine the temporal value (past, now, future, or some mixed state) of each interval. Intervals can be associated with sensors and effectors.
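A minimal sketch of the interval idea: each interval carries a temporal value, and constraints propagate state changes between intervals. Real interval calculi use many relations and mixed states; this toy of mine handles only a single "meets" constraint and three pure states, so treat it as a cartoon of the mechanism.

```python
PAST, NOW, FUTURE = "past", "now", "future"

def propagate(states, meets):
    """Toy interval-state propagation. `states` maps interval names to
    temporal values; `meets` maps an interval to the interval that must
    begin when it ends. Whenever an interval has moved to PAST, its
    successor is promoted from FUTURE to NOW, repeating to a fixed point."""
    changed = True
    while changed:
        changed = False
        for a, b in meets.items():
            if states[a] == PAST and states[b] == FUTURE:
                states[b] = NOW
                changed = True
    return states
```

Attaching sensors (which move intervals into PAST) and effectors (which fire when an interval becomes NOW) to such intervals gives the flavor of scripting interaction with temporal constraints.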
Charles Rich and Candace L. Sidner. Proceedings of the First International Conference on Autonomous Agents, pp. 284-291. Marina del Rey, CA. February 5-8, 1997.
Describes a toolkit that supports the construction of agents who follow the rules of collaborative discourse. This work comes out of MERL.
Frank Thomas and Ollie Johnston. Hyperion. 1981.
Written by two Disney animators, this book describes the history of animation at Disney and what techniques the animators developed to make their characters seem believable. This book has been highly influential in the OZ Project at CMU.
Kristinn Thórisson. Ph.D. thesis. MIT Media Laboratory, 1996.
Describes a system called Gandalf that models human dialog competence in order to communicate with a human using speech and gesture in realtime.
Peter Wavish and David Connah. Proceedings of the First International Conference on Autonomous Agents, pp. 317-322. Marina del Rey, CA. February 5-8, 1997.
Describes a script-based architecture developed at Philips Research Labs for controlling virtual characters.
J. Weizenbaum. Communications of the ACM 9(1):36-45, 1966.
Original paper describing ELIZA, a template-based pattern-matching program that simulates the conversational patterns of a non-directive therapist.
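ELIZA's core mechanism is small enough to sketch: match the input against an ordered list of templates and reflect captured fragments back as questions. This toy omits ELIZA's keyword ranking and pronoun reflection ("my" to "your", "I" to "you"), and the rules below are invented examples rather than Weizenbaum's script.

```python
import re

# Invented example rules; real ELIZA scripts (e.g. DOCTOR) are far richer.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

def eliza_respond(utterance):
    """Toy ELIZA-style responder: return the response template of the
    first rule whose pattern matches, filling in captured fragments;
    fall back to a content-free prompt when nothing matches."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(*m.groups())
    return "Please go on."
```

Even this stripped-down version shows why the program felt conversational: the reflected fragment makes each reply appear responsive to what the user actually said.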
Peter Weyhrauch. Ph.D. thesis, Tech report CMU-CS-97-109, Carnegie Mellon University, January 1997.
Describes the Oz drama manager, a search-based system for guiding an interactive story experience. This work is part of the Oz project.