I have worked both in the context of the Viper system, where I explore vision techniques for outdoor position estimation, and in the JavaBayes system, where I explore robust inference with (Quasi-)Bayesian networks under both local and global perturbations.

The estimation engine in the Viper system is based on a Quasi-Bayesian estimator; the idea is to evaluate robustness as an indication of where to spend computational effort. The estimator builds an occupancy map for the position of the robot; the catch is that the occupancy map actually represents a full density-ratio family of distributions, which generates both the estimates and the confidence in the estimates. Even though I have worked mostly with vision techniques, I have also applied Quasi-Bayesian methods to other sensory modalities, for example acoustic signals. In another direction, I have explored the possibility of learning convex sets of probability distributions from data.
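To make the density-ratio idea concrete, here is a minimal sketch (not the Viper code; the function and variable names are my own) of how a density-ratio class over a discrete occupancy grid yields lower and upper probabilities for a region of interest. The gap between the bounds is one way to decide where additional computational effort is worthwhile:

```python
import numpy as np

def interval_probability(lower, upper, mask):
    """Lower/upper probability of the event `mask` under the density-ratio
    class: all distributions p proportional to some q with lower <= q <= upper
    pointwise.  The extrema are attained by setting q to its upper bound
    inside the event and its lower bound outside (and vice versa)."""
    a = np.asarray(mask, dtype=bool)
    p_up = upper[a].sum() / (upper[a].sum() + lower[~a].sum())
    p_lo = lower[a].sum() / (lower[a].sum() + upper[~a].sum())
    return p_lo, p_up

# toy occupancy grid: cell weights known only up to a factor of 2
weights = np.array([1.0, 1.0, 1.0, 1.0])
lo, up = interval_probability(weights, 2 * weights, np.array([1, 1, 0, 0]))
```

The width `up - lo` measures how robust the position estimate is for that region; cells with wide intervals are natural candidates for extra sensing or computation.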

The JavaBayes system is a general-purpose inference engine for graphical models; it can generate posterior probabilities and expectations for probabilistic models represented as directed acyclic graphs. I have derived and implemented algorithms for robustness analysis of Bayesian networks, both for local and global perturbations. These algorithms were the first to provide exact solutions and convergent approximations for Quasi-Bayesian networks, and they have several advantages over previous efforts.
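As an illustration of robustness analysis under local perturbations, here is a brute-force sketch on a toy two-node network (my own example, not the JavaBayes algorithms): the prior of one node is epsilon-contaminated, and posterior bounds are obtained by enumerating the extreme points of the resulting credal set.

```python
import numpy as np

# hypothetical binary network X -> Y with nominal parameters
p_x = np.array([0.3, 0.7])        # nominal prior P(X)
p_y_x = np.array([[0.9, 0.1],     # P(Y | X=0)
                  [0.2, 0.8]])    # P(Y | X=1)
eps = 0.1                         # contamination size

def posterior_x1_given_y1(prior):
    """P(X=1 | Y=1) for a given prior over X."""
    joint = prior * p_y_x[:, 1]   # P(X, Y=1)
    return joint[1] / joint.sum()

# the eps-contaminated credal set {(1-eps) p_x + eps q} has as extreme
# points the mixtures where q is a point mass on a single state of X
vertices = [(1 - eps) * p_x + eps * np.eye(2)[k] for k in range(2)]
posteriors = [posterior_x1_given_y1(v) for v in vertices]
lo, hi = min(posteriors), max(posteriors)
```

Vertex enumeration like this blows up combinatorially with network size and with global perturbations, which is why exact algorithms and convergent approximations are needed in the general case.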

Apart from the issues above, I have worked on a variety of problems, mostly in Robotics and Artificial Intelligence; I have been particularly involved with mobile robots and their sensors.

Right after my undergraduate studies, I completed a Master of Engineering in Brazil and worked on the first Brazilian mobile robot, called Ariel. We produced a complete system, from the mechanical structure to the planning software; the result was very impressive, and we ended up showing it off on Jornal da Globo (Brazil's second most important TV news program). Unfortunately, that material is not online.

I have worked on several problems in computer vision in the context of the Viper system; some years ago I produced a line linker based on the Akaike Information Criterion (AIC), which was distributed on the net. That code is probably too old to be of interest, but the algorithm using the AIC may be of value; there is a tech report that describes it. Another aspect of my work was the investigation of celestial data as a source of position estimates for mobile robots. Finally, another twist in this work was the study of atmospheric scattering as a cue for depth in outdoor environments; as far as I know, this was the first study of scattering in the context of image understanding.
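For flavor, here is a hedged sketch of an AIC-based merging decision in the spirit of the line linker (my own reconstruction of the general idea, not the distributed code): fit each edge segment and their union by least squares, and merge when the single-line model achieves the lower AIC.

```python
import numpy as np

def line_rss(x, y):
    """Residual sum of squares of the least-squares line fit y = a*x + b."""
    A = np.column_stack([x, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coeffs
    return float(r @ r)

def aic(rss, n, k):
    # Gaussian log-likelihood up to a constant: n*log(rss/n) + 2*k,
    # with k parameters per line model
    return n * np.log(max(rss, 1e-12) / n) + 2 * k

def should_merge(x1, y1, x2, y2):
    """Merge two segments if one line explains both better (by AIC)
    than two separate lines.  A hypothetical criterion for illustration."""
    n1, n2 = len(x1), len(x2)
    two_lines = aic(line_rss(x1, y1), n1, 2) + aic(line_rss(x2, y2), n2, 2)
    x, y = np.concatenate([x1, x2]), np.concatenate([y1, y2])
    one_line = aic(line_rss(x, y), n1 + n2, 2)
    return one_line < two_lines
```

The appeal of an information criterion here is that it trades goodness of fit against model complexity automatically, with no hand-tuned merging threshold.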

I have been interested for some time in the problem of calculating bounds for dynamical systems; I have since discovered a huge literature in this area, which I expect to be of great relevance to robust Statistics in the future. I have published some work on the specific topic of manipulating ellipsoidal models of error in Robotics.

I worked for two years on the Lunar Rover project here at Carnegie Mellon, specifically with the Ratler robot. A number of papers describe the entertaining experiences we had with this robot; we actually had it roll some fifty kilometers in our outdoor tests. I am still part of the Lunar Rover group, since the Viper system demonstrates technology that is used in the Atacama mission.

fgcozman@cs.cmu.edu