It was constructed and programmed at IS Robotics; the primary work on IT occurred between October and November 1994. Colin Angle came up with the concept for IT (the name was meant to stand for Interactive Technology). It was an internally funded research project at IS Robotics meant to demonstrate the feasibility of using a robot with a human-looking face and simple sensors to respond to its environment and communicate an internal emotional state. Possible applications include interactive animatronics and more physical interfaces for computers.
Some quick snapshots of IT.
IT also had audio output. Helen built a simple audio output board around one of the Radio Shack analog recording chips. She recorded me saying a number of simple phrases, and IT would grumble or cry as appropriate to match its emotions. To enhance the effect of the audio, LEDs were placed in IT's mouth which would blink in response to what IT was saying.

The controlling electronics were the same as those for a Hermes walking robot. This board contained a 68332 and was capable of driving up to 12 servos, and it had on-board A/D for sensor input. For sensors IT had three IR proximity sensors around its base, two microphones (one on each side of its head), CdS cells in its eyebrows, and pyroelectric (body heat) sensors in its eyes. All of the processing for IT was mounted on the robot. It was not autonomous, though: its power came from a power supply plugged into the wall. An interface was also provided to an outside computer for monitoring the internal state of IT's software.

IT also had a pair of arms. The arms were actually the Kaa robot, which has two hyper-redundant six-degree-of-freedom arms. When Colin mounted Kaa on IT's base, he added an additional degree of freedom which allowed Kaa to be tilted up and down. This allowed IT to extend its arm to shake someone's hand. Kaa ran some simple code which listened for commands from the main processor over the local network and sent data from Kaa's force sensors back to the main processor.

Rod Brooks did some very early experimentation with controlling the motors. After Rod's early experimentation, I took over the system and did all of the programming. IT is programmed using Rod's behavior control paradigm, in which many parallel threads (processes) run, monitoring sensors and controlling actuators.
If some outside event occurs, a sensor module processes that data and sends a message to another module, which decides what to do and finally sends a command message to a motor control module.

The software for IT maintained its internal emotional state. A number of state variables were used, including happiness, boredom, anger, and sadness, each ranging from -100 to +100. The values of these variables controlled IT's facial expressions: if IT was happy it would smile; if IT was angry, it would raise its eyebrows, frown, and grumble. A number of modules filtered sensor data and modified the emotional state. For example, IT became happier when people were around, so if it got a strong signal from its pyro sensors, that module would increase overall happiness. It would also become angrier if someone shined light in its eyes. The system also had a mechanism by which the emotional variables decayed toward zero over time, allowing IT's happiness or anger to fade.

Other degrees of freedom were controlled more directly by the sensors. For example, IT had a behavior that made it always look toward someone close to it; this behavior used the IR proximity sensors on the front of the robot to directly control the direction the eyes and the head pointed. Other behaviors could also take over control of the low-level facial expression machinery from the emotional state. This happens in IT's surprise behavior: when someone gets too close to IT, it moves its head back, raises its eyebrows, and opens its mouth in surprise. In this case the surprise behavior suppresses the output of the emotional-state monitoring behavior and controls the facial expression of the system for the duration of the behavior.

IT also had a number of other "schticks" which made it interesting.
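The emotional-state machinery described above can be sketched roughly as follows. This is a minimal illustration in Python, not the original 68332 code; the class names, thresholds, and the `control_tick` interface are all assumptions made for illustration.

```python
# Illustrative sketch of IT's emotional-state scheme: clamped state
# variables, decay toward zero, sensor-driven updates, and a surprise
# behavior that suppresses the emotion-driven expression.
# All names and thresholds here are hypothetical, not from the original code.

def clamp(v, lo=-100, hi=100):
    return max(lo, min(hi, v))

class EmotionalState:
    def __init__(self):
        self.vars = {"happiness": 0, "boredom": 0, "anger": 0, "sadness": 0}

    def adjust(self, name, delta):
        # Each variable is kept in the -100..+100 range.
        self.vars[name] = clamp(self.vars[name] + delta)

    def decay(self, rate=1):
        # Each tick, every variable drifts toward zero so moods fade.
        for k, v in self.vars.items():
            if v > 0:
                self.vars[k] = max(0, v - rate)
            elif v < 0:
                self.vars[k] = min(0, v + rate)

    def expression(self):
        if self.vars["happiness"] > 30:
            return "smile"
        if self.vars["anger"] > 30:
            return "frown"
        return "neutral"

def control_tick(state, pyro_strong, bright_light, someone_too_close):
    # Sensor-filtering modules modify the emotional state.
    if pyro_strong:            # body heat detected -> people nearby
        state.adjust("happiness", +10)
    if bright_light:           # light shined in its eyes
        state.adjust("anger", +15)
    state.decay()
    # The surprise behavior suppresses the emotional-state output
    # and takes over the face for its duration.
    if someone_too_close:
        return "surprise"      # head back, brows up, mouth open
    return state.expression()
```

The key design point mirrored here is that surprise does not change the emotional variables; it simply overrides their output for the duration of the behavior, which is how suppression works in the behavior-control style.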
If it sensed a short pulse of high-intensity light, it would assume someone had just taken a picture of it, and it would smile and say "cheese" - a little late (after the picture had been taken) but cute nonetheless.
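The flash-detection trick amounts to spotting a brief spike in the light readings. A hedged sketch, with the threshold and pulse-length constants purely illustrative:

```python
# Hypothetical sketch of the camera-flash "schtick": a short run of
# high light readings followed by a return to ambient counts as a flash.
# FLASH_THRESHOLD and MAX_FLASH_TICKS are made-up values for illustration.

FLASH_THRESHOLD = 80   # light reading that counts as "high intensity"
MAX_FLASH_TICKS = 3    # anything longer is ambient light, not a flash

def detect_flash(light_readings):
    """Return True if the readings contain one short pulse above threshold."""
    run = 0
    for r in light_readings:
        if r > FLASH_THRESHOLD:
            run += 1
        else:
            if 0 < run <= MAX_FLASH_TICKS:
                return True
            run = 0
    return 0 < run <= MAX_FLASH_TICKS
```

Requiring the pulse to be short is what distinguishes a camera flash from someone simply shining a light at the robot, which triggered the anger response instead.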
IT was an interesting experiment. It really helped drive home the point that robots are more interesting to people when they directly interact with people and respond to their actions. I found similar results when kids were exposed to Hoser. Of course, in industrial applications this does not make robots any more useful. However, in entertainment, education, and sales situations interactive robots can be a useful tool and should prove more effective than the standard non-interactive, pre-programmed animatronics now used in those domains.
The press seemed to love IT. In December 1994 it was filmed with Rod Brooks and Alan Alda for episode #705, Robots Alive!, of the Scientific American Frontiers television show on PBS. It was also featured on the Popular Mechanics show. And the cover of National Geographic isn't too bad either. I think all of this attention demonstrates some of the success of this simple system. Is this AI? No, not quite. But it did demonstrate that people react very favorably to a system which seems to have some awareness of its environment and can directly respond to their actions.
IT on the cover of National Geographic Magazine