15 November 1995, 12:00, WeH 7220

Rehearsal and pseudorehearsal as solutions to the catastrophic forgetting problem

Anthony Robins (U. Otago, New Zealand)

How do you incorporate new information into an existing artificial neural network (ANN)? In most ANNs, learning new information results in "catastrophic forgetting" (significant disruption or complete loss) of old information. This problem makes typical ANNs very inflexible once trained.

In this talk we describe a solution to this problem based on rehearsal: the retraining of old information as the new information is added. This forces new information to be integrated into the information already stored in a network rather than simply overwriting it. We also show that it is possible to achieve this effect even when we don't have the old information (the original training population) to rehearse. This "pseudorehearsal" mechanism approximates the old information by sampling the network with randomly constructed "pseudoitems". Both rehearsal and pseudorehearsal increase the flexibility of ANN learning by allowing new information to be integrated into an existing network with minimal disruption of old information.
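The pseudorehearsal idea can be sketched in a few lines: sample random inputs, pair each with the trained network's current output (a "pseudoitem"), and then train on the new item together with those pseudoitems. The following is a minimal NumPy illustration, not the speaker's implementation; the network size, learning rate, and number of pseudoitems are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyNet:
    """One-hidden-layer sigmoid network trained by plain gradient descent."""
    def __init__(self, n_in=4, n_hidden=8, n_out=2):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))

    def forward(self, X):
        H = sigmoid(X @ self.W1)
        return sigmoid(H @ self.W2), H

    def train(self, X, Y, epochs=500, lr=0.5):
        for _ in range(epochs):
            out, H = self.forward(X)
            # Backprop for squared error with sigmoid units.
            d_out = (out - Y) * out * (1 - out)
            d_hid = (d_out @ self.W2.T) * H * (1 - H)
            self.W2 -= lr * (H.T @ d_out)
            self.W1 -= lr * (X.T @ d_hid)

def make_pseudoitems(net, n_items, n_in):
    """Approximate the stored function: random inputs paired with the
    network's own current outputs on those inputs."""
    X_pseudo = rng.uniform(0.0, 1.0, (n_items, n_in))
    Y_pseudo, _ = net.forward(X_pseudo)
    return X_pseudo, Y_pseudo

# Old task, learned first; its training population is then assumed unavailable.
X_old = rng.uniform(0.0, 1.0, (20, 4))
Y_old = rng.uniform(0.0, 1.0, (20, 2))
net = TinyNet()
net.train(X_old, Y_old)

# Capture the old function as pseudoitems before learning anything new.
X_ps, Y_ps = make_pseudoitems(net, 32, 4)

# New item, trained together with the pseudoitems (pseudorehearsal).
X_new = rng.uniform(0.0, 1.0, (1, 4))
Y_new = rng.uniform(0.0, 1.0, (1, 2))
net.train(np.vstack([X_new, X_ps]), np.vstack([Y_new, Y_ps]))
```

Plain rehearsal is the same loop with `X_old, Y_old` in place of the pseudoitems; pseudorehearsal only differs in that the "old" targets are read back out of the network itself.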