Richard D. Braatz (rdb@beethoven.che.caltech.edu) James S. Schwaber (schwaber@eslrx7.es.dupont.com) David Touretzky (dst@CS.CMU.EDU) Thomas F. Enders K. P. Unnikrishnan (unni@neuro.cs.gmr.com)
Panel participants: Martin Pottmann, DuPont
Babatunde A. Ogunnaike, DuPont
James Keeler, MCC, Austin
Michael A. Henson, Louisiana State University
Gerald Dreyfus, ESPCI, Paris
Francis J. Doyle, Purdue
A complementary workshop: "Open and Closed
Problems in Neural Network Robotics" organized by Marcus Mitchell
will be held on Saturday.
The presenters will represent three research groups with active research in the area of bio-control. The workshop promises to be thought-provoking, and a substantial amount of the time will be devoted to discussion. A detailed schedule of the workshop follows:
7:30 Dave Touretzky and A. David Redish summarize their
cognitive neuroscience theory of rodent navigation with
implications for hippocampal function, and its
implementation on a mobile robot.
8:00 discussion period for Touretzky/Redish presentation
8:15 Thomas F. Enders and collaborators summarize their
research efforts in using neural networks in the
development of techniques for the scheduling, control,
and on-line optimization of batch fermentation
processes (e.g., alcoholic fermentation with yeast).
8:45 discussion period for Enders et al. presentation
9:00 panel/general discussion
4:30 James S. Schwaber, Richard D. Braatz, Francis J. Doyle,
Michael A. Henson, Martin Pottmann, and Babatunde
A. Ogunnaike summarize their research efforts in
developing novel process control techniques via
inspiration from the cardiorespiratory reflexes.
5:00 discussion period for Schwaber et al. presentation
5:15 other workshop attendees present their work
6:00 panel/general discussion
Pierre Baldi (pfbaldi@juliet.caltech.edu) Soren Brunak (brunak@cbs.dth.dk)
This workshop will concentrate on the presentation and discussion of the most recent results on the application of machine learning approaches to problems in computational molecular biology. Emphasis will be on both methodological issues and biological relevance.
The morning session will be devoted to DNA/RNA problems; the afternoon session will be devoted to protein problems.
7:30 Pierre Baldi ``Hidden Markov Models of Human Genes''
8:00 Soren Brunak ``Construction of Low Similarity Data Sets of Sequences with Functional Sites for Prediction Purposes''
8:30 Tim Hunkapiller TBA
9:00 Anders Krogh ``Predicting Protein Secondary Structure with Structured Networks''
4:30 Paul Stolorz ``Links between statistical physics and dynamic programming: applications to computational molecular biology''
5:00 Gary Stormo ``Neural Networks for the Identification of Functional Domains Common to Multiple Sequences''
5:30 Niels Tolstrup ``Neural Network Model of the Genetic Code''
6:00 Discussion
Thomas Petsche (petsche@scr.siemens.com) Stephen J. Hanson (jose@learning.siemens.com) Mark Gluck (gluck@pavlov.rutgers.edu)
In many applications, it can be cost effective to ``monitor'' the system of interest and signal an operator when the monitored conditions indicate an imminent failure. This is analogous to periodically glancing at the fuel gauge in your car to make sure you do not run out of gas.
An adaptive system monitor, therefore, is an adaptive algorithm that estimates the condition of the system from a set of periodic measurements. This task is typically complicated by the fact that the measurements are complex and high dimensional. Adaptation is necessary since the measurements will depend on the peculiarities of the system being monitored and its environment.
This workshop will focus on the use of novelty detection for the problem of system monitoring. A novelty detector is a device or algorithm that is trained on a set of examples and learns to recognize or reproduce them. Any new example that differs significantly from the training set is identified as ``novel''.
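As a minimal sketch of the idea (not any presenter's algorithm), the detector below memorizes a set of normal examples and flags a new input as novel when it lies farther from every training example than training examples typically lie from each other; the nearest-neighbor distance measure and the percentile threshold are illustrative assumptions.

    import numpy as np

    class NearestNeighborNoveltyDetector:
        # A deliberately simple stand-in for the autoencoder- and
        # density-based detectors discussed at the workshop.

        def fit(self, X, percentile=99):
            self.X = np.asarray(X, dtype=float)
            # Calibrate the threshold from within-training-set distances:
            # for each training point, the distance to its nearest neighbor.
            d = np.linalg.norm(self.X[:, None] - self.X[None, :], axis=-1)
            np.fill_diagonal(d, np.inf)
            self.threshold = np.percentile(d.min(axis=1), percentile)
            return self

        def is_novel(self, x):
            # Novel if the nearest training example is farther away than
            # training examples typically are from each other.
            return np.linalg.norm(self.X - x, axis=1).min() > self.threshold

    # Usage: train on "healthy" measurements, then screen new readings.
    healthy = np.random.default_rng(0).normal(0.0, 1.0, size=(200, 3))
    detector = NearestNeighborNoveltyDetector().fit(healthy)
    print(detector.is_novel(np.zeros(3)))      # False: looks like training data
    print(detector.is_novel(np.full(3, 8.0)))  # True: unlike anything seen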
The purpose of the discussion is to bring together researchers working on different real world monitoring tasks and those working on novelty detection algorithms in order to hasten the development of broadly applicable adaptive monitoring algorithms.
We expect presentations on several application areas involving a variety of novelty detection algorithms:
7:30-9:00 Helicopter gearbox monitoring: Four presentations by Robert R. Kolesar (ONR), Kourosh Danai (U Mass), Peter Kazlas (U Colorado, Boulder), and Mark Gluck (Rutgers).
4:30-6:00 Engine and electric motor monitoring: Three presentations by Ken Marko (Ford), Scott Smith (Boeing), and Thomas Petsche (Siemens).
Recognizing novelty in classification tasks: Two presentations by Germano Vasconcelos (University of Kent) and Dimitrios Bairaktaris (University of Stirling).
(This list of speakers is preliminary and subject to change.)
Hynek Hermansky (hynek@eeap.ogi.edu) Misha Pavel (pavel@eeap.ogi.edu)
Despite their biological appeal, such front ends have not displaced more traditional representations, which often remain more effective. This workshop addresses why biologically faithful front ends do not couple well to current neural-based recognizers. We will discuss biological front ends and alternatives, particularly representations based on traditional engineering practice but modified to include what is known about human perception. We will also consider in what circumstances biological front ends do offer an advantage, and explore what directions recognition technology must take to make better use of these models.
The workshop will be oriented towards extensive discussion. Several potential participants have expressed interest in presenting short talks to stimulate the discussions, among them:
7:30 Jont Allen (Bell Laboratories, Murray Hill): "Speech Recognition with Human Face"
8:00 Andreas Andreou (Johns Hopkins University): "Analog Auditory Models"
8:30 Malcolm Slaney (Interval Research): "Correlograms"
9:00 Discussion
4:30 Nelson Morgan (International Computer Science Institute and UC Berkeley): "Current Research in Stochastic Perceptual Auditory-event-based Models (SPAM)"
5:00 Chalapathy Neti (IBM Watson Center): "Neuromorphic speech processing for speech recognition in noisy environments"
Joseph Sirosh (sirosh@cs.utexas.edu)
The workshop will focus on collating the open questions and hypotheses about the functional role of intracortical connectivity, and on formulating an agenda for computational and analytical modeling. How do patterned lateral connections form and develop? What do the patterns of lateral connectivity tell us about information stored in the cortex? How could associative information in the lateral connections be expressed during cortical processing? How could lateral connections mediate learning processes in the cortex? What is their role in cortical plasticity? What types of neural network models are best suited for addressing such questions?
7:30--7:55: Gary Blasdel "Correlation between patterns of activity and
            lateral connectivity in primate visual cortex"
7:55--8:20: Terrence Sejnowski "Physiological Effects of Intrinsic Horizontal
Connections in Visual Cortex"
8:20--8:45: Jack Cowan "Geometric Visual Hallucinations and Lateral
Cortical Connections"
8:45--9:10: Dawei Dong "ADD: Associative Dynamical Decorrelation---
A Quantitative Theory of Self-Organization
and Function of Visual Cortex"
9:10--9:30: ***DISCUSSION***
4:30--4:55: Shimon Edelman: "Computational models of 3D object
representation in the visual cortex, and the
possible role of lateral connections"
4:55--5:20: Jonathan Marshall "Do lateral connections help stabilize
perception during occlusion events?"
5:20--5:45: DeLiang Wang "Lateral connections and coherent oscillations"
5:45--6:10: Joseph Sirosh "Cooperative self-organization of lateral
connections and feature detectors in the
visual cortex"
6:10--6:30: ***DISCUSSION***
Lei Xu (lxu@cs.cuhk.hk) Zhaoping Li (zhaoping@cs.ust.hk) Laiwan Chan (lwchan@cs.cuhk.hk)
The study of learning and the understanding of visual processing facilitate each other. In recent years, a number of advances have been made in both areas. For instance, in the area of unsupervised learning, (1) numerous algorithms for competitive learning, PCA learning, and self-organizing maps have been proposed; (2) several new theories and principles, such as maximum coherence, minimum description length, finite mixtures with EM learning, statistical physics, Bayesian theory, exploratory projection pursuit, and local PCA, have been developed; (3) theories for unifying various unsupervised learning rules (e.g., multisets modeling learning theory) have been explored. In the area of visual processing, more knowledge is being gathered experimentally about how visual development can be preserved or altered by neural activities, neural transmitters/receptors, and the visual environment, providing bases and constraints for various learning rules and motivating the study of new ones. In addition, there is now more theoretical understanding of how visual processing units depend on the visual input environment, supporting the rationale for unsupervised learning.
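As a concrete instance of the PCA learning rules mentioned above (and the subject of the opening talk on Oja's equations below), here is a minimal sketch of Oja's single-unit rule for extracting the first principal component; the learning rate, epoch count, and toy data are illustrative assumptions, not any speaker's formulation.

    import numpy as np

    def oja_first_pc(X, lr=0.01, epochs=50, seed=0):
        # Oja's rule: w <- w + lr * y * (x - y*w), with y = w.x.
        # The -y^2 * w term implicitly normalizes w, so for small lr it
        # converges to the leading eigenvector of the data covariance.
        rng = np.random.default_rng(seed)
        X = X - X.mean(axis=0)            # the rule assumes zero-mean inputs
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(epochs):
            for x in rng.permutation(X):  # one stochastic pass per epoch
                y = w @ x
                w += lr * y * (x - y * w)
        return w / np.linalg.norm(w)

    # Toy check: data stretched along the first coordinate axis.
    X = np.random.default_rng(1).normal(size=(500, 2)) * np.array([3.0, 0.5])
    print(oja_first_pc(X))                # close to [1, 0] up to sign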
The purpose of this workshop is twofold: (1) to summarize the advances in unsupervised learning and to discuss whether they can help the investigation of the visual processing system; (2) to survey current results on visual processing and to ask whether they can motivate or provide hints for developing unsupervised learning theories. The targeted participants are researchers working in learning, in visual processing, or in both.
Friday

Lei Xu (chair)
7:30 "Time-Domain Solutions of Oja's Equations", John Wyatt and Ibrahim Elfadel (MIT)
7:50 "Kmeans Performs Newton Optimization", Leon Bottou (Neuristique, Paris) and Yoshua Bengio (University of Montreal)
8:10 "Multisets Modeling Learning: A Unified Framework for Unsupervised Learning", Lei Xu (The Chinese University of Hong Kong and Peking University)
8:30 "Information Theory Motivation for Projection Pursuit", Nathan Intrator (Tel-Aviv University)
8:50 "The Helmholtz Machine", Peter Dayan (University of Toronto)
9:00 Discussion

Zhaoping Li (chair)
4:30 "Predictability Minimization and Visual Processing", Juergen Schmidhuber (Technische Universitaet Muenchen)
4:50 "Non-linear, Non-gaussian Information Maximisation: Why It's More Useful", Tony Bell (Salk Institute)
5:10 "Understanding the Visual Cortical Coding from Visual Input Statistics", Zhaoping Li (Hong Kong University of Science and Technology)
5:30 "Formation of Orientation and Ocular Dominance in Macaque Striate Cortex", Klaus Obermayer (Universitaet Bielefeld)
5:50 "Putative Functional Roles of Self-organized Lateral Connectivity in the Primary Visual Cortex", Joseph Sirosh (University of Texas at Austin)
6:00 Discussion

Saturday

Laiwan Chan (chair)
7:30 "Density Estimation with a Hybrid of Neural Networks and Gaussian Mixtures", Yoshua Bengio (University of Montreal)
7:50 "Learning Object Models through Domain-Specific Distance Measures", Eric Mjolsness (UCSD) and Steve Gold (Yale University)
8:10 "Auto-associative Learning of On-line Handwriting Using Recurrent Neural Networks", Dit-Yan Yeung (Hong Kong University of Science and Technology)
8:30 "Training Mixtures of Gaussians with Deficient Data", Volker Tresp (Siemens AG, Central Research)
8:50 "A Fast Method for Activating Competitive Self-Organizing Neural Networks", George F. Harpur and Richard W. Prager (Cambridge University)
9:10 Discussion

Lei Xu (chair)
4:30 "Neuromodulatory Mechanisms for Regulation of Cortical Self-organization", Michael E. Hasselmo (Harvard University)
4:50 "Learning to Cluster Visual Scenes with Contextual Modulation", Sue Becker (McMaster University)
5:10 "Invisibility in Vision: Occlusion, Motion, Grouping, and Self-Organization", Jonathan A. Marshall (University of North Carolina at Chapel Hill)
5:30 "A Comparative Study on Receptive Filters by PCA Learning and Gabor Functions", Irwin King and Lei Xu (The Chinese University of Hong Kong)
5:50 "Detection of Visual Feature Locations with a Growing Neural Gas Network", Bernd Fritzke (Ruhr-Universitaet Bochum)
6:10 Discussion
Neural network models of natural language processing have mainly focused in recent years on lower-level processes, including learning of past tense constructions, pronunciation, and reading, although some approaches to parsing and learning of grammars have been attempted, with mixed results. In fact, the best results for larger grammars appear to have been achieved by hybrid approaches, while inductive learning techniques have been most successful on small, restricted grammars.
FRIDAY MORNING:
Introductions
7:30 AM Mitch Marcus: "Statistical approaches to NLP"
8:00 AM Gary Cottrell: "Neural net approaches to NLP"
Learning fsa's and pda's
8:30 AM Lee Giles "Learning a class of large finite state
machines with a RECURRENT network"
8:50 AM Sreerupa Das "Differentiable symbol processing and an
application to language induction"
Machine translation
9:10 AM Patrick Juola and James Martin: "Extraction of Transfer
Functions through Psycholinguistic Principles"
FRIDAY AFTERNOON:
Parsing
4:30 PM George Berg "Single Network Approaches to Connectionist Parsing"
4:50 PM Ajay Jain, "PARSEC: Let Your Network do the Walking, but Tell
it Where to Go."
5:10 PM Stan Kwasny: "Training SRNs to Learn Syntax"
5:30 PM Risto Miikkulainen "Parsing with modular networks"
Discussion
5:50 PM - 6:30 PM The assembled crew
SATURDAY MORNING
Word sense disambiguation/discovery/large text corpora
7:30 AM Hinrich Schuetze: "Unsupervised word sense disambiguation
for improved text retrieval"
7:50 AM David Yarowsky "A comparison of word sense disambiguation
algorithms"
8:10 AM Kevin Lund & Curt Burgess: "A model of high-dimensional
semantics from lexical co-occurrence"
8:30 AM Eric Brill "Statistical language processing: What are numbers
good for?"
Discussion
8:50 AM - 9:30 AM The assembled crew
SATURDAY AFTERNOON
Psycholinguistic modeling
4:30 PM Michael Gasser "Modular networks for language acquisition:
Why and how"
4:50 PM David Plaut "Learning arbitrary and quasi-regular mappings
in word reading with attractor networks"
5:10 PM Mark St. John "Practice makes perfect: The key role of
construction frequency in sentence comprehension"
5:30 PM Kim Plunkett (unconfirmed), "Learning the Arabic plural:
The case for minority default mappings in connectionist nets."
Discussion
5:50 PM - 6:30 PM The assembled crew
The neural network approach to medical information processing offers many advantages, as the program below illustrates:
Friday
7:30 Optimizing networks for atlas-guided segmentation of brain images
  Anand Rangarajan, Yale University
8:00 Neural Net Analysis of Solitary Pulmonary Nodules
  Armando Manduca, Mayo Foundation
8:30 Using Neural Networks for Semi-automated Pap Smear Screening
  Laurie J. Mango, MD, and James M. Herriman, Neuromedical Systems, Inc.
9:00 Automated design of optical-morphological structuring elements for Pap smear screening
  J. P. Sharpe, R. Narayanswamy, N. Sungar*, H. Duke, R. J. Stewart, L. McKeogh, and K. M. Johnson, University of Colorado at Boulder and *California Polytechnic State University
4:30 Comparing the prediction accuracy of statistical models and artificial neural networks in breast cancer
  Harry B. Burke, MD, David B. Rosen, and Philip H. Goodman, MD*, New York Medical College and *University of Nevada School of Medicine
5:00 Diagnosis of hepatoma by committee
  Bambang Parmanto and Paul Munro, University of Pittsburgh
5:30 Panel Discussion

Saturday
7:30 Neural Networks for Nonlinear Processing of Biomagnetic/Bioelectric Signals
  Martin Schlang, Michael Haft, and Ralph Neuneier, Siemens AG - Corporate Research and Development
8:00 Neural Networks Distinguish Demented Patients from Elderly Controls Based on EEG Recordings
  Beatrice A. Golomb, MD, and Andrew F. Leuchter, MD, UCLA Department of Medicine and UCLA Neuropsychiatric Institute
8:30 Normal and Abnormal EEG Classification Using Neural Networks and Other Techniques
  Ah Chung Tsoi, University of Queensland
9:00 Issues in Controlling Cardiac Chaos
  Gary W. Flake, Siemens Corporate Research
4:30 Prediction and Control of the Glucose Metabolism of a Diabetic
  Volker Tresp, John Moody*, and Wolf-Rüdiger Delong, Siemens and *Oregon Graduate Institute
5:00 Experiences in using neural networks for detecting coronary artery disease
  Georg Dorffner, Austrian Research Institute for Artificial Intelligence and Department of Medical Cybernetics and AI, University of Vienna
5:30 Panel Discussion
This workshop will feature formal sessions, discussions, and a panel aimed at understanding the dynamics, theoretical capabilities, and practical applicability of recurrent networks. The panel discussion will focus on future directions of recurrent network research.
Friday Morning
Applications: Mahesan Niranjan (chair)
  Opening
  Lee A. Feldkamp (Remarks on Time-Lagged RNNs: Training and Applications)
  Jerry Connor (Bootstrap methods in time series prediction)
  Paul Muller (Programmable Analog Neural Computer: Design and Performance)
  Lee Shung (Learning with smoothing regularization)
  Manuel Samuelides (Application: design of neuro-filters)
  Gary Kuhn (Application of sensitivity analysis)
  Morten With Pedersen (Training and pruning)

Friday Afternoon
Neural Architectures: Lee A. Feldkamp (chair)
  Paolo Frasconi (Learning and rule embedding)
  Lei Xu / Laiwan Chan (Mixture models and the EM algorithm)
  Hava Siegelmann (Towards a Neural Language: symbolic to analog)
  General discussion

Saturday Morning
Dynamics and Biology: Pierre Baldi (chair)
  Pierre Baldi (Trajectory learning using shallow hierarchies of oscillators)
  Mahesan Niranjan (Stacking multiple RNN models of the vocal tract)
  Kenji Doya (Problems concerning bifurcations of network dynamics)
  Hugo de Garis (The CAM-Brain Project: evolution of a billion-neuron brain)
  Dawei Dong (Associative dynamic decorrelation)

Saturday Afternoon
Fundamentals: Hava Siegelmann (chair)
  Yoshua Bengio (On the problem of learning long-term dependencies)
  Barak Pearlmutter (5-minute note on the alleged difficulty of learning over long times)
  Ricard Gavalda (On the Kolmogorov complexity of RNNs)
  Panel discussion
For all the research attempts to apply neural network ideas to robotics, it is still difficult to get clear answers to questions like "Can you use a neural network to control a 6 d.o.f. arm?" or "Do reinforcement learning and dynamic programming methods get killed by the curse of dimensionality?" In addition, robotics is an area with a vast and intimidating "non-neural" literature which must be considered. The main goal of this workshop is to stimulate discussion about which problems have been successfully attacked and what the most important current open problems are. A secondary goal of the workshop is to produce a short consensus list of problem descriptions and their status.
A complementary workshop, titled "Novel Control Techniques from Biological Inspiration", organized by Jim Schwaber et al., may be of interest to participants. None of the presentations in that session will be on robotics, and its main focus will be on nonlinear dynamical systems, e.g. in chemical processes and in neural systems. It is a one day workshop to be held Friday.
7:30 - 7:35 Opening Remarks
Marcus Mitchell, Caltech
7:35 - 8:00 Why it's harder to control your robot than your arm:
closed, open and irrelevant issues in inverse kinematics
Dave Demers, UCSD
8:05 - 8:30 Open Problem: Optimal Motor Hidden Units
Terry Sanger, JPL
8:35 - 9:00 Neural Network Vision for Outdoor Robot Navigation
Dean Pomerleau, CMU
4:30 - 4:55 Learning New Representations and Strategies
Chris Atkeson, Georgia Tech
5:00 - 5:25 A Semi-Crisis for Neural Network Robotics:
Formal Specification of Robot Learning Tasks
Andrew Moore, CMU
5:30 - 6:30 Closing Discussion
Andrew D. Back (back@elec.uq.oz.au) Eric A. Wan (ericwan@eeap.ogi.edu)
7:30-7:45
Opening Discussion - Andrew Back, University of Queensland
7:45-8:00
"Computational Capabilities of Local-Feedback Recurrent Networks"
Paolo Frasconi
University of Florence, Italy
8:00-8:15
"Issues in Representation: Recurrent Networks as Sequential Machines"
C. Lee Giles and B.G. Horne
NEC Research Institute
8:15-8:30
"Properties of Recursive Memory Structures"
Jose C. Principe
University of Florida
8:30-8:45
"A Local Model Net Approach to Modeling Nonlinear Dynamic Systems"
Roderick Murray-Smith
MIT
8:45-9:15 Open forum: 5 minute presentations* by participants
9:15-9:30 Question Time and Discussion
4:30-4:45
"A Spatio-Temporal Approach to Visual Pattern Recognition"
Lokendra Shastri
ICSI
4:45-5:00
"The Performance of Recurrent Networks for Classifying Time-Varying Patterns"
Tina Burrows and Mahesan Niranjan
Cambridge University Engineering Department
5:00-5:15
"Nonlinear Infomax With Adaptive Time Delays"
Tony Bell
The Salk Institute
5:15-5:30
"The Sinc Tensor Product Network"
Jerome Soller
University of Utah
5:30-5:45
"Discriminating Between Mental Tasks Using a Variety of EEG Representations"
Chuck Anderson
Colorado State University
5:45-6:00 Open forum: 5 minute presentations* by participants
6:00-6:30 Question Time and Closing Discussion
* Free time slots for 5 min 'soap-box' talks will be available at the
workshop.
High dimensional spaces have (asymptotic) properties that are nonintuitive when considered from the perspective of the two- and three-dimensional cases generally used for visual examples. As a result, algorithm design in high dimensional spaces cannot always proceed by simple analogy with low dimensional problems. For example, a radial basis network is intuitively appealing for a one dimensional regression task, but it must be used with care in a 100 dimensional space and may not work at all in 1000 dimensions; the sketch below illustrates why. Familiarity with the nonintuitive properties of high dimensional spaces may therefore lead to the development of better algorithms.
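A small numerical sketch of two of these effects (illustrative numbers, not workshop results): distances from the origin of a standard Gaussian concentrate around sqrt(d), so almost no probability mass sits near the mode, and the fraction of data within a fixed radius of any point, roughly the neighborhood a radial basis function sees, collapses as the dimension grows.

    import numpy as np

    rng = np.random.default_rng(0)
    for d in (1, 10, 100, 1000):
        X = rng.normal(size=(10000, d))   # standard Gaussian sample
        r = np.linalg.norm(X, axis=1)
        # Norms concentrate near sqrt(d) with roughly constant spread.
        print(f"d={d:5d}  mean |x| = {r.mean():7.2f}  std |x| = {r.std():.2f}")
        # Fraction of points within unit distance of one sample point:
        # substantial for d=1, essentially zero for d=100 and beyond.
        frac = np.mean(np.linalg.norm(X - X[0], axis=1) < 1.0)
        print(f"         fraction within radius 1 of a sample: {frac:.4f}")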
We will discuss the issues surrounding successful nonlinear regression estimation in high dimensional spaces, and how the resulting techniques can be incorporated into other algorithms and applied to real-world tasks. The workshop will cover topics including the Curse of Dimensionality, Projection Pursuit, techniques for dimensionality reduction, feature extraction techniques, statistical properties of high dimensional spaces, local methods, and the tricks that go along with these techniques to make them work.
7:30 "Statistical Properties of High Dimensional Spaces"
Michael Perrone (IBM T.J. Watson Research Center)
8:00 "Computational Learning and Statistical Prediction"
Jerome Friedman (Stanford University)
8:30 "Discriminant Adaptive Nearest Neighbor Classification"
Trevor Hastie and Rob Tibshirani (Stanford University)
9:00 "Local Methods in High Dimension: Are They Surprisingly Good But Miscalibrated?"
David Rosen (New York Medical College)
Afternoon
---------
4:30 "Is There Anything Positive in High Dimensional Spaces?"
Nathan Intrator (Tel Aviv University)
5:00 "Three Techniques for Dimension Reduction"
John Moody (Oregon Graduate Institute)
5:30 "A Local Linear Algorithm for Fast Dimension Reduction"
Nandakishore Kambhatla (Oregon Graduate Institute)
6:00 "Fuzzy Dimensionality Reduction"
Yinghua Lin (Los Alamos National Lab)
A wide range of approaches has been developed to tackle inverse problems, and one of the main goals of the workshop is to contrast the ways in which they address the underlying technical issues and to identify key areas for future research. Ample time will be allowed for discussion.
7:30 "Welcome and overview" Chris Bishop (Aston)
7:35 "From ill-posed problems to all neural networks and beyond
through regularization" Tomaso Poggio / Federico Girosi (MIT)
7:55 "Solving inverse problems using an EM approach to density
estimation" Zoubin Ghahramani (MIT)
8:15 "Density estimation with periodic variables" Chris Bishop (Aston)
8:35 "Doing it forwards, undoing it backwards: high-dimensional
compression and expansion" Russell Beale (University of Birmingham)
8:55 "Inversion of feed-forward networks by gradient descent"
Alexander Linden (Berkeley)
9:15 Discussion
4:30 "An iterative inverse of a talking machine" Sid Fels (Toronto)
4:50 "Diagnostic problem solving" Sungzoon Cho (Postech, S Korea)
5:10 "Multiple Models in Inverse Filtering of the Vocal Tract"
M Niranjan (Cambridge)
5:30 "Goal directed model inversion" Silvano Colombano (NASA Ames)
5:50 "Predicting element concentrations in the SSME exhaust plume"
Kevin Whitaker (University of Alabama)
6:10 Discussion
Morning
``Control of Phase-lags in Central Pattern Generator''
  Bard Ermentrout, Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260 (bard@mthbard.math.pitt.edu)
``A Model for the Locomotor Network in Lampreys''
  James T. Buchanan, Department of Biology, Marquette University, Milwaukee, WI 53233 (6231buchanan@vmsf.csd.mu.edu)
``Artificial Life Approaches to Building Sensorimotor Systems''
  Peter M. Todd, Department of Psychology, University of Denver (ptodd@smith.rowland.org)

Afternoon
``How Can a Bird Sing a Song It Heard?''
  Kenji Doya (in collaboration with Terry Sejnowski), ATR Human Information Processing Research Laboratories, Howard Hughes Medical Institute, The Salk Institute (doya@salk.edu)
``Modeling Chemotaxis in the Nematode C. elegans''
  Shawn Lockery, University of Oregon, and Steve Nowlan, Synaptics (nowlan@synaptics.com)
``Sensorimotor Integration in a Computational Model of the Articulatory and Phonetic Foundations of Childhood Phonology''
  Kevin L. Markey, Department of Computer Science, University of Colorado, Boulder (markey@tigger.cs.colorado.edu)