Vision and Touch Guided Manipulation Group
MIT Artificial Intelligence Lab & Nonlinear Systems Lab
The Vision and Touch Guided Manipulation group at the MIT Artificial Intelligence Lab
conducts research on a wide variety of topics related to manipulator
and end-effector design, dextrous manipulation, adaptive nonlinear
control, and vision-guided manipulation. We employ techniques from
various fields, including mechanical design, stability theory, machine
learning, approximation theory, and computer vision.
The group is headed by Dr. Kenneth
Salisbury (mechanics) and Professor
Jean-Jacques E. Slotine (autonomy and vision). Other groups at
the MIT AI Lab headed by Ken are the Haptic
Interfaces Group and the Robot
Hands Group. Professor Slotine also heads the Nonlinear Systems Laboratory.
The people in and associated with the Vision and Touch Guided
Manipulation Group are:
- Brian Anthony (touch sensing)
- Mark Cannon (wavelet networks, graduated)
- Brian Eberman (system integration, graduated)
- Brian Hoffman (active vision)
- W. Jesse Hong (vision-manipulation coordination)
- Akhil Madhani (wrist-hand mechanism)
- Günter Niemeyer (adaptive control and system integration)
- Daniel Theobald (visual processing)
- Ichiro Watanabe (machine learning)
Introduction to our Robots
The Whole Arm Manipulator
The MIT Whole Arm Manipulator (WAM) is a very fast, force-controllable
robot arm designed in Dr. Salisbury's group at the AI Lab. The concept
of "Whole Arm Manipulation" was originally aimed at enabling robots to
use all of their surfaces to manipulate and perceive objects in the
environment. Central to this concept (and our group's design efforts
in general) has been a focus on controlling the forces of interaction
between robots and the environment.
To permit this, the WAM arm employs novel cable transmissions which are
stiff, low-friction, and backdrivable. This, in turn, permits a
lightweight design. To achieve good force-control bandwidth while
in contact with the environment, the arm's design maximizes the lowest
resonant frequency of the system and employs an impedance-matching
ratio between motor and arm masses. This also enables the arm to
achieve high accelerations while moving in free space.
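In textbook terms, the impedance match can be sketched as follows (the
symbols are ours, not from the original page). With motor inertia J_m,
transmission ratio n, and arm inertia J_l, the joint acceleration
produced by a motor torque tau_m is

    \ddot{q} = \frac{n\,\tau_m}{n^2 J_m + J_l},
    \qquad
    \frac{d\ddot{q}}{dn} = 0
    \;\Longrightarrow\;
    n^* = \sqrt{J_l / J_m},

so at the optimal ratio the reflected motor inertia n^2 J_m equals the
arm inertia J_l: motor and load impedances are matched, and the
achievable acceleration for a given motor torque is maximized.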
Prof. Slotine and his students have developed system architectures
and control algorithms for both force-controlled tasks and tasks
requiring rapid and accurate free-space motion. The algorithms also
provide fast and stable adaptation of the arm to large variations in
loads and environments.
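For reference, the adaptive manipulator controllers of this kind (our
summary of the well-known Slotine-Li formulation; see the
adaptive-control references below) take the form

    \tau = Y(q, \dot{q}, \dot{q}_r, \ddot{q}_r)\,\hat{a} - K_D s,
    \qquad
    \dot{\hat{a}} = -\Gamma\, Y^T s,

where \tilde{q} = q - q_d is the tracking error,
s = \dot{\tilde{q}} + \Lambda \tilde{q} is a composite error,
\dot{q}_r = \dot{q}_d - \Lambda \tilde{q}, Y is the manipulator
regressor (the dynamics are linear in the unknown parameters a), and
K_D, \Lambda, \Gamma are positive-definite gains. Stability of both
tracking and adaptation is what permits large load variations.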
The Talon
A new wrist-hand mechanism has been developed and replaces a previous
forearm-mounted system. The new wrist-hand, known as the Talon,
provides three additional powered degrees of freedom: one for grasping
forces and two for orientation. The motors for the device are located
in the forearm to minimize end-effector mass and maximize its
workspace. The grasping mechanism comprises a group of two fingers
which move against a group of three fingers, such that the two groups
can be made to mesh together while encircling objects. The fingers'
inner surfaces are serrated to provide high contact friction against
rough (rock) surfaces, and curved to help capture both large and small
objects. Fingers may deflect compliantly to accommodate object
geometry, and finger deflections may be sensed to monitor the grasp
state. We have also studied the design of a miniature end-effector
suitable for grasping small rocks and cylindrical objects. Similar in
spirit to the Talon, the new miniature end-effector utilizes slightly
different kinematics to enlarge its feasible grasping volume.
The Fast Eye Gimbals
A more recent component of our system is our active vision system,
which comprises two high-resolution color CCD cameras with 50 mm
focal-length lenses mounted on two-degree-of-freedom gimbals. We use
cameras with a narrow field of view to obtain higher-resolution images
of typical objects. This implies, however, that the cameras must be
actuated to pan and tilt so that they can cover broad scenes, leading
to an active vision system and an associated trade-off between
controller precision and image resolution (narrowness of field of
view).
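To make the trade-off concrete, a longer lens narrows the field of
view while improving angular resolution per pixel. A minimal Python
sketch of the pinhole-camera geometry (the 50 mm focal length is from
the text; the sensor width and pixel count are illustrative
assumptions):

    import math

    def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
        # Pinhole model: FOV = 2 * atan(sensor_width / (2 * focal_length))
        return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

    # 50 mm lens per the text; 8.8 mm sensor (2/3-inch CCD) and 640-pixel
    # rows are our assumptions, not from the original page.
    fov = horizontal_fov_deg(8.8, 50.0)
    print(f"FOV {fov:.1f} deg; {fov * 3600 / 640:.0f} arcsec per pixel")

With these numbers the cameras see only about a 10-degree swath, which
is why they must be steered to cover broad scenes.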
The actuators we implemented were designed in our lab and are known as
the Fast Eye Gimbals (FEGs). The FEGs provide directional positioning
for our cameras using a drive mechanism similar to the WAM's. The two
joints are cable-driven and have ranges of motion of +/- 90 degrees and
+/- 45 degrees about the base and upper joint axes, respectively. The
two FEGs are currently mounted on ceiling rafters with a wide baseline,
which improves position accuracy in stereo vision methods. Because the
FEGs are independent, we can position each one at a different location
to vary the baseline or the orientation of the coordinate frame, and
easily add cameras for additional perspectives.
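The benefit of a wide baseline follows from the usual first-order
stereo error model: depth error grows with the square of range and
shrinks with baseline. A sketch with purely illustrative numbers (none
of them from the page):

    def depth_error(z_m, baseline_m, focal_px, disparity_err_px=0.5):
        # First-order stereo model: dZ ~ Z^2 * dd / (f * b)
        return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

    # Hypothetical values: 3 m range, ~3200 px focal length for a
    # narrow-FOV 50 mm lens, half-pixel disparity error.
    for b in (0.5, 2.0):  # narrow vs. wide baseline, in meters
        print(f"baseline {b} m -> depth error ~{depth_error(3.0, b, 3200.0) * 100:.2f} cm")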
Research Projects
Robust Grasping in Unstructured Environments
One of our current projects, funded by NASA/JPL, is to develop a
fundamental understanding of the problem of combining real-time vision
and touch-sensor data with robot control, to yield robust, autonomous
and semi-autonomous grasping and grasp stabilization. The research
is focused on providing conceptual and experimental support for planned
and ongoing NASA missions utilizing Earth-orbiting and planetary
surface robotics.
We have implemented a high-speed active vision system, a
multi-processor operating system, and basic algorithms for the
acquisition and grasp of stationary spherical and cylindrical objects
using coordinated robotic vision, touch sensing, and control.
Preliminary experiments on the tracking of moving objects have also
been completed. Concurrently, we are completing research into an
integrated wrist-hand design for performing sensor-guided grasps, and a
preliminary design for a next-generation miniature end-effector.
Robotic Catching of Free-Flying Objects
Another direction of our research, funded by Fujitsu, Furukawa, and
the Sloan Foundation, is to accomplish robust real-time catching of
free-flying objects. We are currently focusing on spherical balls of
various sizes, and are also experimenting with objects that have
different dynamic characteristics, such as sponge balls, cylindrical
cans, and paper airplanes.
Our system uses low-cost vision processing hardware for simple
information extraction. Each camera signal is processed independently
on vision boards designed by other members of the MIT AI Laboratory
(the Cognachrome Vision Tracking System). These vision boards provide
us with the center of area, major axis, number of pixels, and aspect
ratio of the color-keyed image. The two Fast Eye Gimbals allow us to
locate and track fast, randomly moving objects using "Kalman-like"
filtering methods that assume no fixed model for the motion.
Independently of the tracking algorithms, we use least-squares
techniques to fit polynomial curves to prior object-location data to
determine the future path. With this knowledge in hand, we can
calculate a WAM path that matches trajectories with the object,
accomplishing the catch and a smooth post-catch deceleration of object
and arm.
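A minimal Python sketch of this least-squares path prediction; the
sample data and the helper function are hypothetical, but a ballistic
coordinate really is well modeled by a degree-2 polynomial (constant
gravity):

    import numpy as np

    def predict_position(times, positions, t_future, degree=2):
        # Fit a polynomial to past (t, x) samples and extrapolate.
        coeffs = np.polyfit(times, positions, degree)
        return np.polyval(coeffs, t_future)

    # Hypothetical tracking samples: vertical position of a thrown
    # ball in meters, simulated from z = 1 + 4 t - (g/2) t^2.
    t = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
    z = 1.0 + 4.0 * t - 0.5 * 9.81 * t ** 2
    print(predict_position(t, z, 0.5))  # extrapolate to t = 0.5 s

Each coordinate can be fit this way independently, and the fit is
refreshed as new sightings arrive from the vision boards.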
In addition to the basic least-squares techniques for path prediction,
we are experimentally studying nonlinear estimation algorithms that
give "long-term" real-time prediction of the path of moving objects,
with the goal of robust acquisition. The algorithms are based on the
stable on-line construction of approximation networks composed of
state-space basis functions localized in both space and spatial
frequency. As an initial step, we have studied the networks'
performance in predicting the path of light objects thrown through the
air. Further applications may include motion prediction of objects
rolling, bouncing, or breaking up on rough terrain.
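As a toy illustration of such an approximation network, here is a
Python sketch using plain Gaussian radial basis functions and a batch
least-squares fit. This is our simplification: the published networks
(see the Cannon-Slotine reference below) are localized in spatial
frequency as well as space and are constructed on-line.

    import numpy as np

    def gaussian_rbf_features(x, centers, width):
        # Basis functions localized in space; one column per center.
        return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

    # Approximate a 1-D target function on a grid of basis centers.
    x = np.linspace(0.0, 1.0, 200)
    y = np.sin(2 * np.pi * x)                      # target to approximate
    centers = np.linspace(0.0, 1.0, 15)
    Phi = gaussian_rbf_features(x, centers, 0.05)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # batch least squares
    print(np.max(np.abs(Phi @ w - y)))             # worst-case error

In the on-line setting, the weights (and which basis functions are
active) are instead updated stably as each new measurement arrives.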
Some recent successful results with this network have been obtained in
catching sponge balls and even paper airplanes!
[Video: the WAM catching a ball]
[Video: the WAM catching a paper airplane]
© 1995 Photo courtesy of Hank Morgan
Partial List of References
Autonomous Rock Acquisition,
D.A. Theobald, W.J. Hong, A. Madhani, B. Hoffman, G. Niemeyer, L.
Cadapan, J.J.-E. Slotine, J.K. Salisbury, Proceedings of the AIAA
Forum on Advanced Development in Space Robotics, Madison, Wisconsin,
August 1-2, 1996.
Experiments in Hand-Eye
Coordination Using Active Vision, W. Hong and J.J.E. Slotine,
Proceedings of the Fourth International Symposium on Experimental
Robotics, ISER'95, Stanford, California, June 30-July 2, 1995.
Robotic Catching and
Manipulation Using Active Vision, W. Hong, M.S. Thesis,
Department of Mechanical Engineering, MIT, September 1995.
Space-Frequency Localized Basis
Function Networks for Nonlinear System Estimation and Control,
M. Cannon and J.J.E. Slotine, Neurocomputing, 9(3), 1995.
Adaptive Visual Tracking and
Gaussian Network Algorithms for Robotic Catching, H. Kimura
and J.J.E. Slotine, DSC-Vol. 43, Advances in Robust and Nonlinear
Control Systems, Winter Annual Meeting of the ASME, Anaheim, CA, pp.
67-74, November 1992.
Experiments in Robotic
Catching, B.M. Hove and J.J.E. Slotine,
Proceedings of the 1991 American Control Conference, Vol. 1, Boston, MA,
pp. 380-385, June 1991.
Performance in Adaptive
Manipulator Control, G. Niemeyer and J.J.E. Slotine,
International Journal of Robotics Research, 10(2), 1991.
Preliminary Design of a Whole-Arm Manipulation System (WAM),
J.K. Salisbury, W.T. Townsend, B.S. Eberman, D.M. DiPietro,
Proceedings 1988 IEEE International Conference on Robotics and
Automation, Philadelphia, PA, April 1988.
The Effect of Transmission Design on Force-Controlled Manipulator
Performance, W.T. Townsend, PhD Thesis, Department of Mechanical
Engineering, MIT, April 1988. See also MIT AI Lab Technical
Report 1054.
Whole Arm Manipulation, J.K. Salisbury, Proceedings
4th International Symposium on Robotics Research, Santa Cruz, CA, August,
1987.
Design and Control of a
Two-Axis Gimbal System for Use in Active Vision, N. Swarup,
S.B. Thesis, Dept. of Mechanical Engineering, MIT, Cambridge, MA,
1993.
A High Speed Low-Latency Portable
Vision Sensing System, A. Wright, SPIE, September 1993.
Maintainer: jesse@ai.mit.edu
Comments to: wam@ai.mit.edu
Last Updated: Mon Aug 26 15:18:36 EDT 1996, jesse@ai.mit.edu
© 1996, All rights reserved.