Newsgroups: comp.robotics
Path: brunix!sgiblab!swrinde!cs.utexas.edu!uunet!psinntp!gdstech!gdstech!wlim
From: wlim@gdstech.GRUMMAN.COM (Willie Lim)
Subject: Informal Survey (updated)   Part 2/2
Message-ID: <WLIM.93Oct28170111@gdstech.GRUMMAN.COM>
Sender: wlim@gdstech.grumman.com (Willie Lim)
Organization: Grumman Corporation, Bethpage, New York, USA.
Distribution: comp
Date: Thu, 28 Oct 1993 22:01:10 GMT
Lines: 1731

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;                                              ;;;
;;;                                              ;;;
;;; Edited (minor) messages (detailed responses) ;;;
;;;                                              ;;;
;;;                                              ;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

From cfriedla@isx.com Thu May 21 11:53:14 1992
Date: Thu, 23 Apr 92 11:41:43 PDT
To: wlim@gdstech.grumman.com (Willie Lim)
From: cfriedla@isx.com
Subject: Re: An Informal Survey of Robot Development Environment (an RFI)

Our group works with and builds Brooks subsumption robots,
so we all use Mac IIcx or IIci machines.


   ***********************************************************
   *                                                         *
   *    Carl Friedlander Ph.D.              (818)706-2020    *
   *    ISX Corporation                 FAX (818)706-2056    *
   *    4353 Park Terrace Drive        Page (818)592-5410    *
   *    Westlake Village, CA 91361                           *
   *                                                         *
   ***********************************************************
   



From eno@leland.stanford.edu Thu May 21 11:53:32 1992
Date: Thu, 23 Apr 92 12:07:31 PDT
From: eno@leland.stanford.edu
To: wlim@gdstech.grumman.com
Subject: Re:  An Informal Survey of Robot Development Environment (an RFI)

I'm working on a project on landmark-based navigation; we use:

0)  A robot from Nomadic Tech. (based here in Palo Alto) which has three
rings of sensors: i) 16 sonars, ii) 16 infrared sensors, iii) 16 fairly useless
bumpers.  The only processing done on board is control of the motors and sense
cycles, and communication via the radio modem.

1)  I'm using a Mac IIci because basically that's what was available and,
I think until recently, that was the only platform for which they'd written
an interface.  Until now, processing power hasn't really been an issue.

2)  The interface and simulator are both written to run in Allegro Common
Lisp, which is nice.  The on-board stuff for the robot is, as far as I know
(pardon my ignorance, but I haven't had to mess with it yet), C for the
6811s, which is written, compiled, and downloaded from any old PC with the
proper connections.

Hope this helps.  If you want, I can get more technical specs concerning
on-board processing, sonar and IR, etc.

Ben


From noreils@aar.alcatel-alsthom.fr Thu May 21 11:53:44 1992
Date: Fri, 24 Apr 92 10:58:33 +0200
From: noreils@aar.alcatel-alsthom.fr ( Fabrice Noreils )
To: wlim@gdstech.grumman.com
Subject: robotics

Dear Willie:

We have two robots at Alcatel Alsthom Recherche (AAR):

The first one is an indoor robot; it has two motorized wheels and is
equipped with a ring of 16 ultrasonic sensors and a laser-stripe range
finder. This work focuses on world modeling (using sonars), navigation
(obstacle avoidance and path planning), and mission planning.

The second one is an outdoor robot with four tiltable tracks, equipped
with force sensors, an inertial reference system, and a laser-stripe
range finder. This robot is either teleoperated or autonomous; it is
able to model the terrain (with the laser stripe) and plan a trajectory
(by modifying the orientation of the tracks).

Both robots' software runs on VME multi-processor hardware (up to 10
boards) supporting a real-time operating system (VxWorks).

Software is first developed on SPARC II workstations connected via
Ethernet (the workstations run X Windows and Motif) and is then
downloaded to the robots.

If you need more information, please do not hesitate to contact me.

Cheers

- fabrice

------------------------------------------------
Fabrice R. Noreils 
Research Scientist
Alcatel Alsthom Recherche
Route de Nozay
91460 Marcoussis - France
email: 	noreils@aar.alcatel-alsthom.fr
phone:  19 - 33 - 1 64 49 17 73
FAX  :  19 - 33 - 1 64 49 06 95
------------------------------------------------


From jn163051@longs.lance.colostate.edu Thu May 21 11:53:47 1992
Date: Fri, 24 Apr 92 19:20:39 MDT
From: jn163051@longs.lance.colostate.edu
To: wlim@gdstech.grumman.com
Subject: Re: Development Environments for Mobile Robots
Newsgroups: comp.robotics
In-Reply-To: <WLIM.92Apr23135711@gdstech.GRUMMAN.COM>
Organization: Colorado State U. Engineering College
Cc: 

Six-legged walking robot; legs in a stretched hexagon; each leg has three DOF;
legs actuated by pneumatic cylinders, no intermediate positions,
extend/retract only; eight walking directions; spin CW/CCW; limping mode
involves walking on five legs only; speed 0.4 m/s; an IR sensor detects range
and direction to a reflector obstacle.

Controlled by an on-board Motorola 68HC11 EVM; code generated in C on an AT.

Regards    -Joel


From kilian@palm.cray.com Thu May 21 11:53:49 1992
Date: Sat, 25 Apr 92 11:47:40 CDT
From: kilian@palm.cray.com (Alan Kilian)
To: wlim@gdstech.grumman.com
Subject: Re: Development Environments for Mobile Robots
Newsgroups: comp.robotics
In-Reply-To: <WLIM.92Apr23135711@gdstech.GRUMMAN.COM>
Organization: Cray Research, Inc.
Cc: kilian@timbuk.cray.com


I have a mobile robot with a rotating ultrasonic sensor.
It has a Motorola MC68HC16EBV Evaluation board mounted on it.
I use the Motorola Assembler and evb16 debugger on a Zeos Notebook 386 PC
  to develop the software.

 -Alan Kilian           kilian@cray.com 612.683.5499 (Work) 612.729.1652 (Home)
  Cray Research, Inc.   655 F Lone Oak Drive, Eagan  MN, 55121 


From vestli@ifr.ethz.ch Thu May 21 11:53:55 1992
Date: Sun, 26 Apr 92 12:49:00 +0200
From: Sjur Jonas Vestli <vestli@ifr.ethz.ch>
To: wlim@gdstech.grumman.com (Willie Lim)
Subject: Re: Development Environments for Mobile Robots

[stuff deleted --- wlim]

Own design, documentation available on request.

[stuff deleted --- wlim]

Macintosh, connected to the robot via serial link (wireless LAN
in the future, when frequencies are released here), and connected to central
facilities via AppleTalk.

[stuff deleted --- wlim]

Own development (on both Mac and robot), both based on MacMETH (Mac Modula-2,
ETH). Own real-time kernel for the robot.

[stuff deleted --- wlim]

Modula-2


Sjur J. Vestli
Institut fuer Robotik
Swiss Federal Institute of Technology.


From kjb@cs.brown.edu Thu May 21 11:53:59 1992
Date: Sun, 26 Apr 92 12:43:29 -0400
From: kjb@cs.brown.edu (Ken Basye)
To: wlim@gdstech.grumman.com
Subject: Dev. Env. for Mobile Robots
Reply-To: kjb@cs.brown.edu


0) We have a couple of robots built on RWI bases with Denning sonar
modules; the on-board computation is done on New Micros Inc.
68HC11-based single-board computers, which have a FORTH kernel in ROM.

1) We use Sun SPARC 1s and 2s networked with Ethernet (TCP/IP)

2) We edit with GNU emacs, write graphical stuff with Xlib and Motif.

3) We use FORTH on the robot, and C++ on the workstations.

-----------

We have another robot under development - here's my best guesses about
that:

0) The robot is based on a much larger RWI base (about 26 in.
diameter, 1200 watt-hours of battery power).  The on-board computer will be
a Moto. 147 - a 68030-based VME board with lots of interfaces,
including Ethernet.  This will run OS-9 with realtime
extensions.

1) same workstations

2) Same environment on workstations, OS-9 environment on robot

3) We'll use C or C++ on the robot instead of FORTH


From arkin@cc.gatech.edu Thu May 21 11:54:02 1992
Date: Mon, 27 Apr 92 07:32:25 -0400
From: arkin@cc.gatech.edu (Ronald Arkin)
To: wlim@gdstech.grumman.com
Subject: Re:  An Informal Survey of Robot Development Environment (an RFI)

All I have time for:

1) Denning mobile vehicles - DRV-1, MRV-II.
2) Sparcstation IPC, Decstation 5000/120, Microvax II, ethernet links
3) Standard unix environments and tools (X windows)
4) 90% C, 10%lisp.


From dstewart@IUS4.IUS.CS.CMU.EDU Thu May 21 11:54:06 1992
To: wlim@gdstech.grumman.com
Subject: Re: Development Environments for Mobile Robots
Newsgroups: comp.robotics
In-Reply-To: <WLIM.92Apr23135711@gdstech.GRUMMAN.COM>
Organization: School of Computer Science, Carnegie Mellon
Cc: 
Date: Mon, 27 Apr 92 8:41:18 EDT
From: dstewart@IUS4.IUS.CS.CMU.EDU
Sender: dstewart@IUS4.IUS.CS.CMU.EDU

[stuff deleted --- wlim]

Self-Mobile Space Manipulator (SM^2).
    (Principal Investigators: 
	Ben Brown (hbb@ri.cmu.edu) and Takeo Kanade (tk@ri.cmu.edu))

This walking robot has been designed for space station trusses, both to
travel on them and to manipulate objects (e.g. other parts of the
truss structure, satellites, or tools required by astronauts).
 
[stuff deleted --- wlim]

VMEbus-based system, with several Ironics MC68020 and MC68030 single
board computers (SBC).  Each SBC is running the Chimera II Real-Time
Operating System (finger 'chimera@cmu.edu' for more info on it).
The system also contains ADCs, DACs, and PIOs for I/O.  The VME
system is hosted by a Sun workstation, on which development is
performed, using standard Sun tools (e.g. cc, dbx, X or Sunview, etc.)
and the Chimera II support tools.  Communication between the Sun
and the VME system is through a BIT3 VME-to-VME adaptor.  The robot
is tethered, and hence does not carry the computer on board.

Multiple CPUs are used to get additional performance.  All interprocessor
communication is over the VME backplane and is processor-transparent;
these capabilities are provided by Chimera II.

[stuff deleted --- wlim]

C for everything.  We also use "reconfigurable systems software" 
methodology for developing modular and reusable code (finger
'chimera@cmu.edu' for references).

Dave Stewart


From aras@eceris.ece.ncsu.edu Thu May 21 11:54:10 1992
Date: Mon, 27 Apr 92 16:12:33 -0400
From: aras@eceris.ece.ncsu.edu (Caglan M. Aras)
To: wlim@gdstech.grumman.com
Subject: Re: Development Environments for Mobile Robots
Newsgroups: comp.robotics
In-Reply-To: <WLIM.92Apr23135711@gdstech.GRUMMAN.COM>
Organization: North Carolina State University
Cc: 

[stuff deleted --- wlim]

We are using a 6-processor VME system (4 x 68040, 2 x 68020) with OS-9 and
P/NET, a distributed pipe manager for multiprocessing. We also have a
Datacube frame grabber and framestore. We use the VME system both for
development and for the actual control. In addition, we have used the
Image program on the Macintosh to process some of the images before
writing specific programs on the VME system.



-- 
Caglan M. Aras			| Robotics and Intel. Systems Lab.	
RISL- Box 7911			| aras@eceris.ece.ncsu.edu
North Carolina State University | ph: 919-515-5405
Raleigh, NC 27695		| 


From nhaas@watson.ibm.com Thu May 21 11:54:16 1992
Date: Tue, 28 Apr 92 20:33:24 EDT
From: nhaas@watson.ibm.com
To: wlim@gdstech.grumman.com
Subject: Re: comp.robotics query

[stuff deleted --- wlim]
  The short answer is that we use about 70% Symbolics, 15% RS/6000,
5% DOS machines of various flavors, and 10% Suns.  Over time, the RS6000s
will increase.  We have (as you know) 4 RWI 12" bases, and the Eshed Scorbot
III arm. We have since acquired a Zebra Zero Gravity arm, specially lengthened
to our specifications.

   We use Lisp, except for Alberto and the vision people, who use C.
Jon has lately been trying to use Microsoft Small C on one of the DOS machines.
He doesn't like its lack of a debugger, or at least of a decent one.

  On the RS6000, we use Lucid lisp, with CLOS and CLIM. It works OK for me,
but Jon does a lot of array-consing, and the garbage collector seems to have
a bug where it runs out of space eventually. (I don't see this in my work,
though.) We are still running the beta-test version they sent us last summer;
they say there'll be a new, real production version of Lucid coming in a few
weeks.  The debugger for this Lisp is OK but not great by any means.  There is
one guy around here who has Allegro; he says the debugger is really nice, but
still only about 75% of what Genera provides.  CLOS seems to have no problems
of any kind. I like CLIM a lot, especially the colors (!) and 8-bit gray
scale (Symbolics' displays have to dither to show gray), but we still have
version 1.0, which I think has some flaky (repeatable, but stupid) behavior
which I expect will be ironed out in the next release of CLIM (1.1? 2.0?).

  We of course use X Windows for everything on the AIX or UNIX machines.
We mostly use Motif (mwm); Alberto uses twm, if I remember correctly.  The
DOS machines mostly do NOT run Windows, but there are 2 here that do.
(We have 286 and 386 processors; no 486s so far.  All stationary.)  On board
the robots there is only Jon's 68HC11 network, coupled by radio modems
(ARLAN is the manufacturer) to the Symbolics machines.

   Networking: Symbolics' Chaosnet, Ethernet, and PC token-ring LAN. TCP/IP,
NFS-mounting between all machines except PCs.

   We do not have VxWorks, OS-9, CONDOR, or anything like that.

   The C people on the RS6000 use something called SabreC, which they claim
is great in all respects.

    Alberto uses Khoros for image-munging, and another tool with a name like
XRS. He likes them.

   For figure document preparation, there's I-DRAW, which might be available
on both UNIX and AIX, which creates just wonderful color foils, and I gather
it's not hard to run.


   Hope this will do. Feel free to ask questions on anything I've forgotten
to address.

-Norm


From rbc@engr.ucf.edu Thu May 21 11:54:23 1992
Date: Wed, 29 Apr 92 11:38:10 EDT
From: rbc@engr.ucf.edu (Robotics Club)
To: wlim@gdstech.grumman.com
Subject: Re: Development Environments for Mobile Robots
Newsgroups: comp.robotics
In-Reply-To: <WLIM.92Apr23135711@gdstech.GRUMMAN.COM>
Organization: engineering, University of Central Florida, Orlando
Cc: 

[stuff deleted --- wlim]

6-legged walking machine, designed for the SAE Walking Machine
Decathlon. Each joint is powered by a DC brush motor, with feedback
given by a potentiometer. Digital sensors include foot on
ground, obstacle in leg path, and platform level. 

[stuff deleted --- wlim]

Uh, a Commodore 64. They are cheap, and we have had experience with
them. We use a custom-made memory map controller using four 6522 
Versatile Interface Adapter chips, and one ADC8017 16-channel,
8-bit A/D. 

[stuff deleted --- wlim]

... We use SuperC, a C language compiler for the C64.
[stuff deleted --- wlim]

C


Note: We are moving to another robot, same configuration; we will
use a 68000-based Amiga 500, another custom controller card with
four 6522s and 2 ADC8017s, and the development environment is DICE,
Dillon's Integrated C Compiler, a freeware C compiler. 

We use Amigas and CBM equipment due to the fact that interfacing
to the 68000 is very simple, and we use them as our own computers.
The Amiga has a very nice near-real-time multitasking operating
system, which will run in 512K. WITH your code.


From nhaas@watson.ibm.com Thu May 21 11:54:27 1992
Date: Wed, 29 Apr 92 12:27:49 EDT
From: nhaas@watson.ibm.com
To: wlim@gdstech.grumman.com
Subject: More

   I meant to mention in my previous note that the typical way to run Lisp
under AIX is to run it in a GnuEmacs buffer. With Lucid, there is no
particular synergy; with Allegro, there is the ILISP package (which I think
was written at CMU), which is a suite of GnuEmacs-lisp routines that tightly
couple the editor to the Lisp, so that you can do "meta-point", for instance,
and many more good things. I implemented a couple of these sorts of things
myself to work with Lucid, but didn't achieve the same level of performance,
because I was just passing text strings to *standard-input*, and Lisp read
them whenever it felt like it. The ILISP system uses a TCP/IP "socket", and
starts a Lisp process which is dedicated to reading it; this process is
asynchronous with the main Lisp process or processes.
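[Editor's note: the socket arrangement described above can be sketched in miniature. This is a toy illustration only, not ILISP's actual code or protocol; the names `start_reader` and `handler` are hypothetical. A dedicated reader thread serves a TCP/IP socket and handles each form the moment it arrives, instead of leaving text in *standard-input* until the main loop gets around to polling it.]

```python
# Toy sketch (not ILISP itself) of a dedicated socket reader that is
# asynchronous with the main process, versus text dropped into stdin.
import socket
import threading
import time

def start_reader(handler, host="127.0.0.1"):
    """Spawn a dedicated listener thread; return the bound port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))                 # ephemeral port
    srv.listen(1)

    def loop():
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as f:
            for line in f:              # each line is one "expression"
                handler(line.strip())   # handled as soon as it arrives

    threading.Thread(target=loop, daemon=True).start()
    return srv.getsockname()[1]

# The "editor" side connects and sends a form; the reader picks it up
# without the main thread ever touching standard input.
results = []
port = start_reader(lambda expr: results.append("evaluated: " + expr))
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"(+ 1 2)\n")
cli.close()

for _ in range(100):                    # wait briefly for the reader thread
    if results:
        break
    time.sleep(0.05)
```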


From Reid_Simmons@FREESIA.LEARNING.CS.CMU.EDU Thu May 21 11:54:30 1992
To: Willie Lim <wlim@gdstech.grumman.com>
Subject: Re: [wlim: An Informal Survey of Robot Development Environment (an RFI)] 
In-Reply-To: Your message of "Tue, 05 May 92 06:10:45 EDT."
             <9205051010.AA01323@gdstech.grumman.com> 
Date: Tue, 05 May 92 11:11:12 -0400
From: Reid Simmons <Reid_Simmons@FREESIA.LEARNING.CS.CMU.EDU>

Willie,

Sorry for the delay in responding.  I'll answer regarding my two robot
projects: the Ambler Planetary Rover and the Mobile Manipulator.
If you'd like, I could forward this on to other projects at CMU (NavLab,
HMMWV, Tessalator).

Reid

I. AMBLER: 
   0) A brief description of your robot configuration.
Ambler is a six-legged robot designed for autonomously walking across rugged
terrain, such as that found on Mars.  The Ambler features orthogonal legs
(RPP mechanisms) that decouple horizontal and vertical motions.  It has a
novel *circulating gait*, in which trailing legs recover through a central
body, past the other legs, to become the new leading leg.  The Ambler uses
a scanning laser rangefinder to provide 3D terrain maps, and six-axis force
sensors in each foot to indicate terrain contact.  The Ambler also has
incremental and absolute encoders on each joint, and two inclinometers to
indicate tilt and roll of the machine.

   1) Type of workstations (e.g. SUN, Silicon Graphics, IBM
      RS6000, HP, IBM PC/XT/AT/286/386/486, Macintosh, ...)
      used and why.  If you use more than one workstation, please say how
      they are networked (e.g. ethernet with TCP/IP, Appletalk,...)
The Ambler uses 9 Creonics motion control boards for low-level servo
control.  The real-time controller is implemented using two Motorola CPU
boards that communicate via a VME backplane.  Perception and planning
software run on two on-board Sun SPARC II workstations, which communicate
with each other and the controller via thin-wire Ethernet and TCP/IP.  The
perception software communicates with the laser scanner via an S-bus to
VME-bus converter.  The various hardware choices were made mainly for
reasons of availability and compatibility with our existing development
environment.  We also make use of two simulators -- a 2D X-based simulator,
that runs on Sun workstations, and a 3D simulator running on a Silicon
Graphics IRIS workstation.

   2) The development environment used for both the workstation and
      the on-board robot  e.g. GNU software, MPW (for the
      Mac), X windows, MOTIF, VxWorks, CLIM, ...  
Operating system is standard Sun Unix for the perception and planning
software, VxWorks for the real-time controller.  User interface is done
with a combination of X and Motif.  Systems integration uses the home-grown
Task Control Architecture (TCA), a message-based system for passing data
and control information between distributed processes.  Communication
through TCA is via TCP/IP, and the architecture runs on Sun, NeXT, and RT
machines. 

   3) Languages used: C, LISP, Prolog, CLOS, ...
Development uses C exclusively.



II. Mobile Manipulator
   0) A brief description of your robot configuration.
The Mobile Manipulator project is based on a HERO 2000, which is a
2-wheeled robot with manipulator arm and gripper.  The sensors include
scanning sonar, fixed base-mounted sonar, and a pointable wrist-mounted
sonar.  There is a BW camera mounted on the robot plus a ceiling-mounted
camera for use within the laboratory.  The tasks of the Mobile Manipulator
include collecting cups from the lab floor, retrieving and delivering
printer output, recharging itself when necessary, and exploring its
surroundings.

   1) Type of workstations (e.g. SUN, Silicon Graphics, IBM
      RS6000, HP, IBM PC/XT/AT/286/386/486, Macintosh, ...)
      used and why.  If you use more than one workstation, please say how
      they are networked (e.g. ethernet with TCP/IP, Appletalk,...)
The on-board processors of the HERO are Z8088s (I think).  These are
programmed to execute guarded move commands and to provide a higher-level
interface to the off-board system.  The off-board system uses standard Sun
workstations, mostly IPCs and ELCs.  We also have a speech interface to the
robot operating on a NeXT workstation.  Communications to/from the robot is
via a 9600 baud radio link.  Camera images from the robot are transmitted
via a standard Rabbit video transmitter, and digitized on a Matrox board.

   2) The development environment used for both the workstation and
      the on-board robot  e.g. GNU software, MPW (for the
      Mac), X windows, MOTIF, VxWorks, CLIM, ...  
Development uses standard Sun Unix.  User interface is via X windows
and text-based entry.  Allegro Common Lisp is used for the LISP-based
processes; the cc compiler for the C-based processes.  Systems integration of
the distributed processes is via the Task Control Architecture, which is
used to connect both the LISP and C processes.

   3) Languages used: C, LISP, Prolog, CLOS, ...
On-board software development is in HERO-Basic (sigh).  Off-board
development is in C (for the controller and perception software) and LISP
(for the planning and user-interface software).


From slack@starbase.MITRE.ORG Thu May 21 11:54:33 1992
Return-Path: <slack@starbase.mitre.org>
Date: Tue, 5 May 92 11:59:51 EDT
From: slack@starbase.MITRE.ORG (M. G. Slack)
To: wlim@gdstech.grumman.com
Cc: bonasso@starbase.mitre.org, slack@starbase.mitre.org
In-Reply-To: R. Peter Bonasso's message of Fri, 1 May 92 09:08:57 EDT <9205011308.AA22969@starbase.mitre.org>
Subject: [wlim@gdstech.grumman.com: An Informal Survey of Robot Development Environment (an RFI)]


Willie,

   I would like to do an informal survey of the development environments
   being used out there.  The things that I would be interested in are:

      0) A brief description of your robot configuration.

A Denning MRV-1 mobile robot platform:
  -The base is synchro-drive
  -An optional head platform is installed with an independent turn axis
  -Upper ring of 24 sonars at 15-degree separation, about 2.5 feet off the floor
  -Lower ring of 6 sonars, approximately equally spaced with baffles, about 5 inches off the floor
  -Pitch and roll clinometers
  -IR beacon reading system
  -Laser target reading system
  -Base processor is a 68000, which communicates with the robot's subsystems
  -Additional 100 amp-hour, 24-volt power and the necessary DC-DC converters 
   to supply power to an on-board Mac Quadra 
  
-  We also have a PRISM real-time stereo vision system which can be mounted on
the robot and controlled to provide additional sensory information.

      1) Type of workstations (e.g. SUN, Silicon Graphics, IBM
	 RS6000, HP, IBM PC/XT/AT/286/386/486, Macintosh, ...)
	 used and why.  If you use more than one workstation, please say how
	 they are networked (e.g. ethernet with TCP/IP, Appletalk,...)

The Mac Quadra on board uses RS-232 ports to command and read the
sensors and actuators, leaving the low-level control to the peripheral
computers.  Ethernet is used to download programs to the Quadra, which
compiles them into a runtime module.  There is also an RF modem which is used
to send telemetry to another Macintosh located remotely; it allows a user to
observe internal information of the robot and provides a mechanism for traded
control. 


      2) The development environment used for both the workstation and
	 the on-board robot  e.g. GNU software, MPW (for the
	 Mac), X windows, MOTIF, VxWorks, CLIM, ...  

Code is developed in a Lisp environment (Mac Lisp or on a uExplorer) using the
REX/GAPPS languages, which reside inside Lisp.  The result is a circuit
definition which implements the developed algorithm.  The circuit is then sent
to a backend which generates a simulation of the circuit in C, Lisp, etc.
We use the C backend.  The C code is then sent to the robot, where it is
compiled and run.

      3) Languages used: C, LISP, Prolog, CLOS, ...

C, C++, Lisp

   Please send your responses (keep them short, please) to me
   (wlim@gdstech.grumman.com).  When I get enough responses I'll post a
   summary to comp.robotics.

   Thanks in advance.



   Willie

From gat@robotics.Jpl.Nasa.Gov Thu May 21 11:54:36 1992
Date: Tue, 5 May 92 10:01:43 PDT
From: gat@robotics.Jpl.Nasa.Gov (Erann Gat)
To: wlim@gdstech.grumman.com
Subject: Re:  An Informal Survey of Robot Development Environment (an RFI)

Unfortunately, these are not short answers in our case.  We have at
least seven mobile robot systems and four manipulators and every one
has a different setup - everything from Suns running RCCL to 6811's
running ALFA.  Perhaps if you told me why you need to know I could focus
the answer appropriately.

E.


From Reid_Simmons@FREESIA.LEARNING.CS.CMU.EDU Thu May 21 11:54:48 1992
To: Willie Lim <wlim@gdstech.grumman.com>
Subject: Re: [wlim: An Informal Survey of Robot Development Environment (an RFI)] 
In-Reply-To: Your message of "Wed, 06 May 92 06:13:46 EDT."
             <9205061013.AA12062@gdstech.grumman.com> 
Date: Wed, 06 May 92 08:12:27 -0400
From: Reid Simmons <Reid_Simmons@FREESIA.LEARNING.CS.CMU.EDU>

[stuff deleted --- wlim]

Reid

PS In my description of the Ambler's controller, the two Motorola CPU
chips are one 68030 and one 68020.


From nivek@frc2.frc.ri.cmu.edu Thu May 21 11:54:53 1992
Date: Wed, 6 May 92 10:22:17 EDT
From: Kevin Dowling <nivek@frc2.frc.ri.cmu.edu>
To: cet@cs.cmu.edu, benny@frc2.frc.ri.cmu.edu, nivek@frc2.frc.ri.cmu.edu,
        wlim@gdstech.grumman.com
Subject: Re: [wlim: An Informal Survey of Robot Development Environment (an RFI)]
cc: benny, cet

Willie,

Most robot configurations here in Field Robotics and in Vision and Autonomous
Systems are mobile systems, with some advanced manipulation work going on as
well (Khosla et al.).

Nearly all workstations here are Sun SPARCs, with a smattering of IRISes for
simulation and graphics, although the Suns running XGL are pretty fast.
There are a couple of Macs too, but they are used only for some Mac applications
and not for robot control.  We also have special-purpose hardware: a number of
VME-based systems, a MasPar parallel processor, a Titan graphics machine, etc.

Development environments are UNIX, usually running X11R5 and/or OpenWindows.
VxWorks is used for real-time development on most systems, and CHIMERA (a
home-brew RTOS) is also used for some manipulator control.

There are extensive development systems for task control and communications,
such as the Task Control Architecture (TCA), and large libraries of imaging
software (GIL).

The language most used is C, not for any particular religious reasons, but
because it is most convenient and offers HLL power and LLL flexibility.  LISP
is also used to a smaller extent.

Hope this helps.

					nivek


From Chuck_Thorpe@IUS4.IUS.CS.CMU.EDU Thu May 21 11:54:55 1992
To: Willie Lim <wlim@gdstech.grumman.com>
Cc: nivek@IUS4.IUS.CS.CMU.EDU, benny@IUS4.IUS.CS.CMU.EDU
Subject: Re: [wlim: An Informal Survey of Robot Development Environment (an RFI)] 
In-Reply-To: Your message of "Wed, 06 May 92 09:39:33 EDT."
             <9205061339.AA13567@gdstech.grumman.com> 
Date: Wed, 06 May 92 17:32:57 -0400
From: Chuck_Thorpe@IUS4.IUS.CS.CMU.EDU


0)  Navlab and Navlab II mobile robots

1)  Sun-4's, for historical reasons

2)  X windows, sometimes Athena widget set

3)  C

... but all the above may change in the near future, as we look for faster
machines, look at C++, InterViews, etc.

-- Chuck


From bob@robocop.ee.washington.edu Thu May 21 11:55:08 1992
From: Robert W. Albrecht eeb537 5-1600  <bob@robocop.ee.washington.edu>
Subject: robot development environment
To: wlim@gdstech.grumman.com
Date: Wed, 13 May 92 9:26:37 PDT
Mailer: Elm [revision: 64.9]

Per your request, here is a summary of robot development environment.

0)  ROBOTS:  2 Denning mobile robots with ultrasonics, laser
reflectometer, video, tactile bumpers, infra-red beacon detection,
wheel encoders, speech synthesis, etc.

1)  WORKSTATIONS:  7 HP 9000 series 300 workstations - ethernet TCP/IP

2)  WORKSTATION DEVELOPMENT ENVIRONMENT:  Gensym G2 real-time expert
system on off-board workstations, with a radio link and/or wire line to the
robots.  A locally developed multitasking operating system named LLAMA
runs on board the robots, on an MC68000, within a basic OS-9 operating
system.

3)  LANGUAGES:  Gensym G2 off-board, with foreign functions in C.  G2
is basically written in LISP.  LLAMA on board, which is FORTH-like but
written in C.
--
Bob Albrecht


From maja@ai.mit.edu Thu May 21 11:55:11 1992
From: maja@ai.mit.edu (Maja J. Mataric)
Date: Thu, 21 May 92 10:48:11 EDT
To: wlim@gdstech.grumman.com
In-Reply-To: Willie Lim's message of Thu, 21 May 92 09:34:46 EDT <9205211334.AA01513@gdstech.grumman.com>
Subject: [wlim: Survey]



Robot configuration: 

We are working with 20 small mobile robots.  The robots are
rectangular, foot-long wheeled bases equipped with piezoelectric bump
sensors around the body, and a forklift for finding, carrying, and
stacking pucks.  The forklift has a suite of 6 infrared sensors: two
for obstacle avoidance, four in the jaw for puck positioning, and two
for alignment and puck stacking.  The system also includes two radio
stations which allow the robots to triangulate their position, and to
transmit and receive 8-bit messages at 1 Hz.  This system allows us to
test a variety of collective and cooperative behaviors.
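[Editor's note: a position fix from two stations, as described above, can be computed by intersecting two range circles. The sketch below assumes range measurements and known station coordinates; the numbers are made up for illustration and are not taken from the actual system, which may fix position differently.]

```python
import math

def two_station_fix(s0, s1, r0, r1):
    """Intersect two range circles centered on stations s0 and s1.
    Returns the two candidate positions; the ambiguity is typically
    resolved by dead reckoning or a third measurement."""
    dx, dy = s1[0] - s0[0], s1[1] - s0[1]
    d = math.hypot(dx, dy)                   # distance between stations
    a = (r0**2 - r1**2 + d**2) / (2 * d)     # distance from s0 to the chord
    h = math.sqrt(max(r0**2 - a**2, 0.0))    # half-length of the chord
    mx, my = s0[0] + a * dx / d, s0[1] + a * dy / d
    return ((mx + h * dy / d, my - h * dx / d),
            (mx - h * dy / d, my + h * dx / d))

# Hypothetical stations 10 m apart; robot 5 m from one, sqrt(65) m
# from the other.  The true position (3, 4) is one of the candidates.
p, q = two_station_fix((0, 0), (10, 0), 5.0, math.sqrt(65))
```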

Workstations:
Macintosh IIsi, Macintosh II.

Development Environment:
Macintosh Allegro Common Lisp.

Languages: 

The robots are programmed in the Behavior Language, a high-level
parallel robot programming language.  The Behavior Language compiles
into the Subsumption Architecture, which further compiles into a
variety of target assembly languages, including that of the 68HC11
microprocessor, which is what our robots use.


From ian@ai.mit.edu Sat May 23 06:19:34 1992
From: ian@ai.mit.edu (Ian D. Horswill)
Date: Fri, 22 May 92 13:51:06 EDT
To: wlim@gdstech.grumman.com
In-Reply-To: Willie Lim's message of Thu, 21 May 92 09:34:46 EDT <9205211334.AA01513@gdstech.grumman.com>
Subject: [wlim: Survey]

Hi Willie,

[stuff deleted --- wlim]

I have an RWI base (B12) with a small VME card-cage on top.  It
contains a Pentek 4283 DSP card and a Data Translation frame grabber.
It has a small Chinon CX-101 video camera with a 3mm lens and a voice
synthesizer.  A 6811 board acts as a peripheral processor.

[stuff deleted --- wlim]

Macintosh w/Appletalk and TCP/IP.  Downloading is via the serial port.

[stuff deleted --- wlim]

My own lisp cross-compiler running under Macintosh Common Lisp v. 2.0.

[stuff deleted --- wlim]

Senselisp, a statically typed variant of Scheme with ML-style type
inference, macros, and pointer arithmetic.  The implementation does
not support GC or closures for obvious reasons.

-ian




From jar@cs.cornell.edu Tue May 26 06:30:08 1992
Date: Mon, 25 May 92 16:20:06 -0400
From: jar@cs.cornell.edu (Jonathan Rees)
To: wlim@gdstech.grumman.com
Subject: Development Environments for Mobile Robots


FYI:

The Cornell Computer Science Robotics and Vision Lab has 2 mobots
with:

    RWI 12" wheel base
    Gespak 68000 single-board computer (0.5 Mbyte ROM, 0.25 Mbyte RAM)
      used for executive level control; multithreaded Scheme
      interpreter in ROM
    Custom Intel 80c196 board controlling various sensors
    IR proximity detectors and modem
    Polaroid sonar (12 transducers, with RWI control board)
    Speech, numeric keypad, LCD display, simple bumpers

Scheme programs run autonomously, but are debugged with the help of
Lucid on a workstation, attached with an RS232 tether (tether is used
only for the debugging link).

More mobots on the way: one with a custom tank-tread base, and
one with vision.

Jonathan Rees


From Liscano@IIT.NRC.CA Tue May 26 14:27:03 1992
From: Liscano@IIT.NRC.CA
Date: Tue, 26 May 1992 09:34 EST
Subject: Mobile Robot Survey
To: wlim@gdstech.grumman.com
X-Envelope-To: wlim@gdstech.grumman.com
X-Vms-To: Lim

Date	5/26/92
Subject	Mobile Robot Survey
From	Liscano
To	Lim

Subject:     Mobile Robot Survey
Organization: National Research Council of Canada.
Project: EAVE
Robot: Customized Cybermotion.
Development: MAC II's, 68020's.
Languages: C, HARMONY OS (multi-tasking & multi-processing), MacAPP.

Ramiro Liscano
Institute for Information Technology
National Research Council
Bldg M-50 Montreal Rd.
Ottawa, ONT. Canada K1A 0R6
Tel: (613) 993-6565
liscano@iit.nrc.ca




From jar@cs.cornell.edu Tue May 26 14:27:10 1992
Date: Tue, 26 May 92 10:53:15 -0400
From: jar@cs.cornell.edu (Jonathan Rees)
To: wlim@gdstech.grumman.com
Subject: Development Environments for Mobile Robots

And for what it's worth, there's a paper describing the Cornell mobot
and its development environment:

Jonathan Rees and Bruce Donald.
Program Mobile Robots in Scheme.
{\em Proceedings of the 1992 IEEE International Conference on
  Robotics and Automation,} pages 2681-2688.


From kak@ecn.purdue.edu Tue Jun  9 07:16:22 1992
To: wlim@gdstech.grumman.com
Cc: kak@ecn.purdue.edu
Subject: hello.....
Date: Mon, 08 Jun 92 23:01:18 -0500
From: Avi Kak <kak@ecn.purdue.edu>



 Willie,


 	Looks like you completely missed out on Purdue University in your
	comp.robotics survey of May 23, 1992.

	The mobile robot PETER in the Robot Vision Lab at Purdue University
	is one of the earliest K2A robots made by what was then known as
	the Cybermation company.

	This robot is capable of completely autonomous indoor navigation
	using model-based vision. It maintains a speed of approximately
	10 to 12 meters per minute under vision control, and this speed
	is not much affected by the collision-avoidance functions.
	Collision avoidance is done using ultrasonic sensors.
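	As a minimal sketch of this kind of sonar-based collision
	avoidance (an illustration only, not the Purdue code; the
	transducer count and the two range thresholds are assumptions),
	one might scale the commanded speed by the nearest ultrasonic
	range:

```c
/* Hedged sketch: slow or stop the base when the nearest ultrasonic
 * reading falls below a safety threshold.  NUM_SONARS, STOP_CM, and
 * SLOW_CM are illustrative assumptions, not measured values. */
#include <stddef.h>

#define NUM_SONARS 12
#define STOP_CM    30.0    /* halt inside this range */
#define SLOW_CM    100.0   /* begin slowing inside this range */

/* Return a speed scale factor in [0, 1] given the latest ranges (cm). */
double speed_scale(const double ranges[], size_t n)
{
    double nearest = ranges[0];
    for (size_t i = 1; i < n; i++)
        if (ranges[i] < nearest)
            nearest = ranges[i];

    if (nearest <= STOP_CM)
        return 0.0;                    /* obstacle too close: stop */
    if (nearest >= SLOW_CM)
        return 1.0;                    /* clear: full commanded speed */
    /* linear ramp between the two thresholds */
    return (nearest - STOP_CM) / (SLOW_CM - STOP_CM);
}
```

	The linear ramp keeps vision-guided cruise speed untouched in
	open space, consistent with the observation above that collision
	avoidance barely affects the robot's speed.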

	Further details on how the robot does this can be found in the
	forthcoming article


	   Authors:  A. Kosaka and A. C. Kak
	   
	   Title:    Fast Vision-Guided Mobile Robot Navigation using
	             Model-based Reasoning and Prediction of Uncertainties
		     
           Journal:  Computer Vision, Graphics, and Image Processing -- IMAGE
	             UNDERSTANDING, September 1992 (To appear).
		     

		     
        Attached below are the vital statistics.


				     			Avi Kak
							
							kak@ecn.purdue.edu



Organization    Project         Robot Type      Development     Languages &
                                                HW Environment  SW Environment
============    =============   ==========      ==============  ==============

 Purdue         Navigation      Cybermotion     VME 68030       VxWorks, C
 University     and Control     K2A
                using Vision



From kak@ecn.purdue.edu Tue Jun  9 13:54:48 1992
To: wlim@gdstech.grumman.com (Willie Lim)
Cc: kak@ecn.purdue.edu
Subject: Re: ...another thing... 
In-Reply-To: Your message of "Tue, 09 Jun 92 07:23:06 EDT."
Date: Tue, 09 Jun 92 11:52:23 -0500
From: Avi Kak <kak@ecn.purdue.edu>


  
  Willie,

  	We use a Sun4 for software development. All the software is in C.

	Up to this time, the camera images were transmitted to the Sun4
	over a wireless video link and all the vision processing was
	done off-board. Collision avoidance was done on-board by
	a 68030 supervisory processor, which ran the Xinu operating
	system developed by Prof. Doug Comer of the CS Department here.

	Last February we purchased a VxWorks operating system for the
	68030 on-board computer. Our plan this summer is to do all
	processing on board and that includes all the vision processing.

	You might also like to know that the mobile robot has a personnel
	detection system that uses a combination of thermal and ultrasonic
	signals in an on-board neural-network based procedure. For up to
	8 feet, this system has no false alarms and 100% detection accuracy.
	We are currently trying to increase the range of robust personnel
	detection, since 8 feet is simply not enough for most sponsors.
	
	Incidentally, our mobile robot project is now seven years old. Being
	not too PR-oriented, we simply have not made too big a noise about
	it. We did present our work at this year's AAAI Spring Symposium.

						   Avi



						   
 P.S.: PETER started drooling at the prospect of meeting SmartyCat.
       Perhaps a meeting could be arranged in the not too distant
       future.



From vjs@kau.vtt.fi Fri Jul 17 16:10:17 1992
Date: Tue, 30 Jun 1992 13:18:02 +0300
From: Virpi Santti <vjs@kau.vtt.fi>
To: wlim@gdstech.grumman.com
Subject: Re: An Informal Survey of Robot Development Environment ...


You asked for information about ongoing robotics research about
a month ago. I received your request only yesterday, but I hope you are
still interested. Well, here it comes:

Akseli is the official nickname given to the autonomous mobile
robot constructed in the Machine Automation Laboratory, Technical
Research Center of Finland (VTT). Akseli has a self-made base
with a turtle configuration: on each side there is an independently
driven wheel, and there are free castor wheels at the front and back.
The drive wheels are controlled with HP-1100 microchips.

Akseli currently carries a 386 PC running MS-DOS. In the near future
the operating system will be replaced by LynxOS. For perception,
Akseli carries 16 fixed and 5 rotating ultrasonic sensors, and 6
infrared sensors. Akseli has a radio modem link to a controlling
(and monitoring) computer. The software is written in C.

Akseli is built for testing different  kinds  of  algorithms  for
path  planning, navigation, movement control, environment percep-
tion and collision avoidance.

If you need more information, do not hesitate to contact me.

Regards, 
vjs.

******************************************************************************
Virpi J. Säntti			     Internet: Virpi.Santti@kau.vtt.fi
Research Scientist			     or,       vjs@kau.vtt.fi
Technical Research Centre of Finland	     Tel.    	+358 31 163 630
P.O. Box 192, SF-33101 Tampere, Finland	     Telefax  	+358 31 163 494
******************************************************************************



From rg@nymph.msel.unh.edu Sun Jul 19 08:34:43 1992
Date: Sat, 18 Jul 1992 17:51:23 -0400
From: rg@nymph.msel.unh.edu (Roger Gonzalez)
To: wlim@gdstech.grumman.com (Willie Lim)
In-Reply-To: wlim@gdstech.grumman.com's message of Fri, 17 Jul 1992 20:42:54 GMT
Subject: Update to mobots survey


Here is another belated addition.

The MSEL robots are underwater autonomous vehicles (AUVs).  Two are open-frame
high precision short range testbeds, one is a hydrodynamic long range long
endurance data sampling vehicle.

Our old robots run low-level code using pSOS on 68000 boards built at the lab.
High level applications run on Ironics 6U VME boards also running pSOS.  The
two levels communicate via a serial line.  Code is developed for the low level
system on a Charles River Data System Universe 68, and for the high level on
an Ironics Performer 32.  Code is burned into prom for the low level, and
stored on a WORM drive for the high level.  In the lab, code can be
dynamically downloaded to the low level via serial lines, and to the high
level via a Bit 3 VME-VME adaptor.

This setup is pretty old, and thus we are moving to....

Two VME card cages running low power Oettle+Reichler CMOS boards, connected
via ethernet.  One cage is for vehicle system software, and is 3U format.  The
other cage (optional) is for high level applications, and is 6U.  Both run
VxWorks, and communicate with their Sun Sparcstation development environments
via ethernet.

-Roger




From connolly%rabbit@cs.umass.edu Sun Jul 19 20:31:44 1992
Date: Sun, 19 Jul 92 15:57:25 -0400
From: connolly%rabbit@cs.umass.edu (Christopher Ian Connolly)
To: wlim@gdstech.grumman.com
Subject: Re: Update to mobots survey
Newsgroups: comp.robotics
In-Reply-To: <1992Jul17.204254.14564@gdstech.grumman.com>
Organization: University of Massachusetts, Amherst
Cc: 

[stuff deleted --- wlim]

Well, I thought I'd add to the list and give you a brief description
of the mobile robot work here at UMass:

We (the Laboratory for Perceptual Robotics) have a Denning platform
with a 24-transducer sonar ring - I don't know the model number
offhand.  It's due to be gutted anyway to put in a VME cage with our
own software, so I would imagine a model number isn't too important.

We use it as a testbed for reactive planning using harmonic functions,
which in turn serves as our model for motor planning in the mammalian
central nervous system.  It will eventually be equipped with a stereo
head, but for the moment only has sonar.  A DECstation 5000 is
normally used for control and planning. 

The computer vision group here also has a Denning equipped with sonar
ring and camera, which they use for (among other things) visual
navigation experiments.  In conjunction with the vision group, we are
also experimenting with real-time path planning (again using harmonic
functions) where the obstacles are visually obtained.  In this case, a
Sun sparcstation is the host.  
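The harmonic-function idea mentioned above can be sketched as a
simple relaxation of Laplace's equation over an occupancy grid (an
illustration only, not the UMass implementation; the grid size,
boundary values, and iteration count are assumptions).  Because a
harmonic field with these boundary conditions has no local minima
away from the goal, gradient descent on it yields a path to the goal:

```c
/* Hedged sketch of harmonic-function planning: relax Laplace's
 * equation on a small grid, holding obstacle cells at potential 1 and
 * the goal cell at 0; a robot then descends the resulting field. */
#define W 8
#define H 8

/* grid: 1 = obstacle, 0 = free.  u: potential field, updated in place
 * by Gauss-Seidel sweeps.  Border cells are assumed to be obstacles. */
void relax(int grid[H][W], double u[H][W], int gx, int gy, int iters)
{
    for (int it = 0; it < iters; it++)
        for (int y = 1; y < H - 1; y++)
            for (int x = 1; x < W - 1; x++) {
                if (grid[y][x] || (x == gx && y == gy))
                    continue;          /* fixed boundary values */
                u[y][x] = 0.25 * (u[y-1][x] + u[y+1][x] +
                                  u[y][x-1] + u[y][x+1]);
            }
}
```

After relaxation, every free cell's potential lies strictly between
the goal value and the obstacle value, and following the steepest
descent from any free cell leads to the goal.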

In both cases, we prototype in Lisp and C, and write the end code in
C.

					-CC
-- 
	-	-	-	-	-	-	-
Christopher Ian Connolly			connolly@cs.umass.edu
Laboratory for Perceptual Robotics		wa2ifi
University of Massachusetts at Amherst		Amherst, MA 01003


From dudek@mcrcim.mcgill.ca Wed Jul 22 11:06:33 1992
From: Gregory Dudek <dudek@mcrcim.mcgill.ca>
Date: Wed, 22 Jul 1992 10:27:34 EDT
In-Reply-To: Mail Delivery Subsystem's message as of 30 Jun
Alt-Path: dudek@mcrcim.mcgill.edu
Organization: McGill Research Center for Intelligent Machines (McRCIM)
X-Mailer: Mail User's Shell (7.2.1 12/20/90)
To: wlim@gdstech.grumman.com
Subject: Survey



  Hello, I hope you're well.
  The mobile robot survey was good to see.

  I thought I might send you info on our lab here at McGill University's
Research Centre for Intelligent Machines (McRCIM).

0) ROBOTS: RWI B-12 robot with 13 sonar sensors; other robots and sensor
systems are under development.  We should have a pan-tilt video
head working with it soon.  There's also work being done on walking
robots.  (We also have a variety of non-mobile robot arms.)

1) WORKSTATIONS: Mostly SUN Sparc machines.  Networked to other
machines (SGIs, MASPAR, Datacube) via ethernet & 
TCP/IP.  Some mc68hc11 devices for low-level control functions.

2) The main development environment is based on Standard SUN UNIX
tools and some GNU stuff, all using X-windows.  Small-C on IBM PC
is used for the 68hc11s.

3) Pretty much all our work is done in C and C++.

  Regards, Greg Dudek.


From ulrich@aifh.edinburgh.ac.uk Thu Jul 23 10:01:51 1992
Date: Thu, 23 Jul 92 14:45:14 BST
From: Ulrich Nehmzow <ulrich@aifh.edinburgh.ac.uk>
Subject: Robots in Edinburgh
To: wlim@gdstech.grumman.com
Organisation: Dept. of Artificial Intelligence, Univ. of Edinburgh.

Willie,

I picked up your review of current robotics programmes today, and I
wonder whether you would be interested to include the Edinburgh robots.

At the Department of Artificial Intelligence at Edinburgh we have built
two small mobile robots to conduct experiments in the autonomous
acquisition of motor-sensory and navigational skills: the robots learn
through experience and use connectionist computing architectures to
associate sensor signals with motor actions. There are a number of
publications, for example in "From Animals to Animats", MIT Press 1991
and the proceedings of "Intelligent Autonomous Systems 2", Amsterdam
1989. 

ALDER

Alder is the first robot we built; its base is made from Fischertechnik
(a technical kit), and it has two driving motors and a caster wheel. The
controller is an ARC52, using an 8052 microcontroller. The robot has up
to eight binary sensors (tactile, revolution counter, forward motion
sensor and pushbutton switches) and one ultrasonic sensor. It is
programmed in BASIC; programs are downloaded from either a SUN or a PC.

CAIRNGORM

Cairngorm's base is also made from Fischertechnik. The robot has a
Flight 68k controller, using a 68000 processor. Programs are written in C
on a SUN, cross-compiled, and then downloaded to the robot.
Cairngorm has binary sensors (tactile, revolution counter and pushbutton
switches).


The work will now continue at the Laboratory for Cognitive Neuroscience
at the Department of Psychology, here in Edinburgh. 

I hope this is useful for you; please ask if you want more information!


Ulrich.


Ulrich Nehmzow
Laboratory for Cognitive Neuroscience at the Department of Psychology
Edinburgh University
Scotland
ulrich@castle.ed.ac.uk



From murphy@baby_doe.mines.colorado.edu Wed Jan  6 17:00:18 1993
Date: Fri, 1 Jan 93 11:37:01 EST
From: murphy@baby_doe.mines.colorado.edu (DR. ROBIN R. MURPHY)
To: wlim@gdstech.grumman.com
Subject: Mobots Survey, addition




I'd like to add our robotics program to the list:

Organization: Colorado School of Mines

Project: Mobile robots

Robot type: Denning MRV-III with Sparc IPX directly mounted 

Development/HW Environment: Sparc IIs, IPXs, IBM RS/6000s

Languages and SW Environment: C, X11, Khoros, and a homemade
	potential-fields X11 visualization tool


Other information:

Our basic research approach is a hybrid hierarchical/reactive
architecture, similar to the AuRA architecture.  Our mission is:

The Mobile Robotics/Machine Perception Laboratory is a facility
devoted to basic and interdisciplinary research, technology transfer,
and hands-on education in artificial intelligence through robotics.
Research and technology transfer efforts will concentrate on the
reduction of human risk in hazardous situations, stewardship of the
environment, and/or improvement of the quality of life through better
manufacturing processes.

Our specific projects are on robotic navigation for
situation assessment and rescue operations (think
"mine cave-in" and Chernobyl), transportation of
hazardous materials, and space mining.

Colorado School of Mines is a special purpose university
for science, engineering, and high technology, located
in the foothills of the Rocky Mountains 20 minutes from
Denver.  We have approximately 3000 undergraduates and 1300 graduate
students and 230 faculty.  CSM offers BS, MS, and PhD degrees 
in Mathematical and Computer Sciences, as well as an 
interdisciplinary Robotics and AI minor.

Contact person:

Robin R. Murphy, Assistant Professor
Department of Mathematical and Computer Sciences

rmurphy@mines.colorado.edu


Thanks,

Robin




From konolige@ai.sri.com Thu Oct 28 07:28:53 1993
Date: Fri, 16 Jul 93 14:49:20 PDT
From: Kurt Konolige <konolige@ai.sri.com>
To: wlim@gdstech.grumman.com
In-Reply-To: Willie Lim's message of Thu, 15 Jul 93 22:29:31 EDT <9307160229.AA11901@gdstech.grumman.com>
Subject: Project Descriptions

FLAKEY

Artificial Intelligence Center, SRI International

Flakey is an autonomous mobile robot used for research in control,
planning, sensing, and acting.  Flakey's control software is built on
modules called control structures, which incorporate a set of fuzzy
rules designed to reliably achieve a particular goal in a specified
context.  Multiple control structures operate in parallel, and are
coordinated by a task-level planner.  Flakey integrates several
diverse methods of acquiring and interpreting sensor data, including
an occupancy grid for local obstacle avoidance based on sonars and
stereo vision; 2D recognition routines for landmark acquisition and
tracking; and geometrical modeling procedures for recognizing and
tracking objects.
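The fuzzy-rule blending described above might be sketched roughly as
follows (an assumption-laden illustration, not SRI's actual code; the
type and function names are made up): each rule contributes a
recommended output weighted by the degree to which its context holds,
and the recommendations are combined by a truth-weighted average:

```c
/* Hedged sketch of fuzzy-rule output blending (a common
 * defuzzification scheme), in the spirit of the control structures
 * described for Flakey but not taken from them. */
typedef struct {
    double truth;   /* degree to which the rule's context holds, 0..1 */
    double output;  /* control value the rule recommends, e.g. turn rate */
} FuzzyRule;

/* Combine the active rules; fall back to 0.0 if no rule fires. */
double blend(const FuzzyRule rules[], int n)
{
    double num = 0.0, den = 0.0;
    for (int i = 0; i < n; i++) {
        num += rules[i].truth * rules[i].output;
        den += rules[i].truth;
    }
    return den > 0.0 ? num / den : 0.0;
}
```

Rules whose contexts do not hold (truth near 0) drop out of the
average automatically, which is one way multiple control structures
can run in parallel without an explicit arbiter.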

Flakey is a homebrew base incorporating 2 golf-cart gel cells for a
total of 700 watt-hours of energy.  Typically we can run for 3 hours
without recharging, more if there is relatively little motion.  Flakey
is equipped with sonar and bumper sensors and two video cameras.
Computation is supplied by a Z80 motor controller and a Sparc10/30
processor, upgradeable to 4 processors.  The Z80 provides basic motor
control, odometry, and sonar firing; cycle time is 20ms.  The Sparc
handles everything else, including speech synthesis and recognition,
vision and sonar interpretation, planning and reactive control.  There
is a radio ethernet bridge (AGILIS) at 200 kb/s to our Sun network for
development.  

The Sparc has a standard UNIX environment (no special realtime
software) running Lucid LISP and C.  Most of our software is written
in Lisp, including the realtime controller that operates at 10Hz.
Graphic output is with X-windows.  There is a simulator which is
heavily used for debugging.



From tyson@WPI.EDU Thu Oct 28 07:32:59 1993
From: Tyson David Sawyer <tyson@WPI.EDU>
Date: Tue, 20 Jul 1993 17:16:36 -0400
To: wlim@gdstech.grumman.com
Subject: Req for Proj Descriptions


0)
  a) James
  b) Worcester Polytechnic Inst.
  c) Develop methods for autonomous robotics
  d) Just beginning
  e) Vision and navigation

1) James is built on an RWI B12 mobile base which has an 8 bit 6MHz
   NEC 78310 uController.  It is connected by an RS-232 to an RWI
   sonar controller which has a 68HC11.  The 68HC11 acts as the
   central behaviour controller in its spare time after controlling
   the sonar.  There are 12 sonar transducers evenly spread over the
   front 180 degrees.  The B12 base is equipped with contact sensors.
   All programming was done in assembly and Small-C.

2) A Gateway 2000 type PC was used for all programming.

3) Development was done under Windows, but all the compilers and
   assemblers ran under DOS.

4) Languages used: 78310 assembly, 68HC11 assembly and small-C



From masaki@ai.mit.edu Thu Oct 28 07:33:31 1993
From: masaki@ai.mit.edu (Masaki Yamamoto)
Date: Tue, 20 Jul 93 18:57:55 EDT
To: wlim@gdstech.grumman.com
Cc: masaki@ai.mit.edu
In-Reply-To: Willie Lim's message of Fri, 16 Jul 93 23:09:10 EDT <9307170309.AA29134@gdstech.grumman.com>
Subject: Request for Project Descriptions


Hi Willie,

There are two robots that I would like to make entries.

--Masaki Yamamoto 
  (Visiting scientist from Panasonic at MIT AI lab)


**********************************************************************
SUGGESTED FORMAT FOR PROJECT DESCRIPTIONS: 

   0) A brief description of your MOBILE robot project:
       a) Project/robot name
            SOZZY
       b) Institution
            MIT
       c) Goals of the research
            To show a new prototype of a domestic robot:
            1) A small and cheap robot which sits between
               conventional service robots and hobby robots.
            2) Robust performance based on insect-like behaviors.
            3) A simulated hormone system which makes the robot's
               behavior more flexible and sometimes more human-friendly.
       d) Current state
            Programming and experimenting.
       e) Future plans
            Finishing the project by writing a paper on it.

   1) Robot configuration: 
      type of sensors: IR, bump switches, dust sensor, pyro sensor
      on-board processors: 6811
      mobile base: "homemade"
	
   2) Type of workstations: Macintosh
      as Behavior Language requires it.

   3) The development environment used: Macintosh Common Lisp

   4) Languages used: Behavior Language


**********************************************************************
SUGGESTED FORMAT FOR PROJECT DESCRIPTIONS: 

   0) A brief description of your MOBILE robot project:
       a) Project/robot name
            GOPHER
       b) Institution
            MIT
       c) Goals of the research
            To show a new prototype of an office robot:
            1) A compact and cheap robot which can navigate through
               narrow spaces and remain safe should it fail.
            2) Vision-based navigation using a newly developed
               compact, low-power, real-time active vision system.
       d) Current state
            Programming and experimenting.
       e) Future plans
            Preparing a paper.

   1) Robot configuration: 
      type of sensors: active vision sensor with 2 DOF, IR, tactile
                       sensor, flux gate compass
      on-board processors: 68332 board (Vesta)
      mobile base: R2 (ISRobotics)
	
   2) Type of workstations: Macintosh, SUN
      as Behavior Language and GCC require them.

   3) The development environment used: Macintosh Common Lisp, X windows

   4) Languages used: Behavior Language, GCC



From dudek@mcrcim.mcgill.edu Thu Oct 28 07:38:49 1993
From: Gregory Dudek <dudek@mcrcim.mcgill.edu>
Date: Wed, 21 Jul 1993 14:37:58 -0400
In-Reply-To: Willie Lim's fascinating message of 16 Jul
X-Mailer: Mail User's Shell (7.2.5 10/14/92)
To: wlim@gdstech.grumman.com (Willie Lim)
Subject: Re: Request for Project Descriptions


SUGGESTED FORMAT FOR PROJECT DESCRIPTIONS: 

    0) A brief description of your MOBILE robot project:

There are several projects.  These phrases subsume several:
  1 - autonomous exploration, sensing and map construction at
    multiple levels of abstraction.
  2 - multi-sensor active vision for mobile robotics.

    a) Project/robot name

1) The Mobile Robotics Project (contact: Dr. G. Dudek).
2) The QUADRIS project (contact: Dr. M. D. Levine).


    b) Institution

McGill Research Centre for Intelligent Machines
McGill University

    c) Goals of the research

To develop systems capable of learning about and acting in 
an a priori unknown environment.
In the short term, to develop models for the representation of space,
for intelligent exploration,
and for scene understanding using multiple sensors.

    d) Current state

Vision and robotics groups (each including several faculty) are
independently pursuing topics on several fronts (biological vision,
3-D modelling, manipulator design, etc.).

The mobile robotics group is pursuing a combination of theoretical
issues and practical problems.  The practical testbed is a small
commercial mobile platform.

    e) Future plans

The scope of our research is currently expanding, both in terms
of research goals and hardware.  A new lab has almost been completed.
Ongoing long-term research objectives are too varied to be fairly summarized
here.

    1) Robot configuration: type of sensors (e.g. sonars, feelers,
    "vision", IR, ...), on-board processors (e.g., VME boards,
    Z80's, 486's, 6.270 boards, mini-boards, ...), mobile base
    (e.g., "homemade", Cybermotion, RWI, TRC, Denning, Nomadics,...)

Base: RWI B-12.
Sensing: Combination of sonar, video (mounted on a pan-tilt head),
 and feelers.  An active head and active laser sensing hardware is under
 development.  
On-board processing: minimal; a couple of microcontrollers and, sometimes,
 a 68000-based system.  Most computing is performed off-board.


    2) Type of workstations (e.g., SUN, Silicon Graphics, IBM
    RS6000, HP, IBM PC/XT/AT/286/386/486, Macintosh, ...)
    used and why.  If you have more than one workstation, please say
    how they are networked (e.g. ethernet with TCP/IP, Appletalk,...)

SUNs are used as general-purpose compute engines.
SGIs are used for "virtual reality" controllers (and general computation).
A Macintosh is used to provide portability and presentation design.
Everything is networked via ether using TCP/IP.
Some vision related stuff is being developed for C-40 microcontrollers.

    3) The development environment used for both the workstation and
    the on-board robot  e.g. GNU software, MPW (for the
    Mac), X windows, MOTIF, VxWorks, CLIM, ...  

The usual stuff for the workstations: X-windows, IRIS GL.

    4) Languages used: IC, LISP, Prolog, CLOS, ...

Mainly C.




From ishiguro@sys.es.osaka-u.ac.jp Thu Oct 28 07:41:43 1993
Return-Path: <ishiguro@sys.es.osaka-u.ac.jp>
Date: Mon, 26 Jul 1993 19:53:06 +0900
To: wlim@gdstech.grumman.com
From: ishiguro@sys.es.osaka-u.ac.jp
Subject: Request for Project Descriptions

Our mobile robot project

                       Hiroshi Ishiguro & Saburo Tsuji
               Department of Systems Engineering, Osaka University
                         Toyonaka, Osaka 560, Japan

0) A brief description of our mobile robot project:
        a) Robot name: SAI (Situated Artificial Intelligence)
        b) Institution: Department of Systems Engineering, Osaka University
        c) Goals of the research:
                (Goal 1) Development of an autonomous mobile robot which can
                recognize an unknown environment with panoramic sensing.
                (Goal 2) Development of a vision-guided mobile robot which can
                accommodate to a dynamically changing world with multiple
                vision agents (4 vision sensors).
        d) Current state:
                (Goal 1) The robot can move in real time in an unknown
                environment using only vision sensors and can build
                environmental models.
                (Goal 2) The robot can simultaneously observe several events
                in a dynamic world.
        e) Future plans:
                (Goal 1) Making qualitative maps of the environment.
                (Goal 2) Development of an attention control system for the
                multiple vision agents.
1) Robot configuration
        a) Type of sensors: 4 vision sensors which can be rotated
           independently around a common axis.
        b) On-board processors: VME board computer (CPU = 68030)
        c) Mobile base: "homemade" (two driving wheels with optical encoders)
2) Type of workstations
        a) Workstations: SUN workstations (IPX, SPARCstation 2)
        b) Computer network: Ethernet with TCP/IP
3) Development environment
        a) Workstation: X windows
        b) On-board computer: VME computer debugging monitor
4) Languages used: C



From bartha@corsair.aa.wpafb.af.mil Thu Oct 28 07:47:25 1993
Date: Mon, 19 Jul 93 10:20:44 EDT
From: bartha@corsair.aa.wpafb.af.mil (Bartha)
To: wlim@gdstech.grumman.com
Subject: Re:  Request for Project Descriptions

   0) MOBILE robot project description:
       a) Project/robot name: no name yet
       b) Institution: Wright Laboratory 
       c) Goals of the research: Learning to intercept targets using drive-reinforcement
                                 neural networks.
       d) Current state: Enough hardware in place to begin development and testing.
                         Development environment still being set up.
       e) Future plans: Incrementally add new behaviors.

   1) Robot configuration: HERO 2000 robot with rotating and base sonar, light, sound,
                           temperature, and battery sensors.  A CCD camera for vision 
                           and additional ADCs for tactile sensors will also be added.
                           
   2) Type of workstation: IBM compatible 286 which will soon be replaced
      by an onboard 486 notebook computer.  The main reason an IBM PC based
      system is being used is that the existing vision processing hardware 
      and software is IBM PC and DOS based.  Low cost is another reason.

   3) The development environment: DOS and HERO BASIC

   4) Languages used: Microsoft C and Assembler


Dr. Gabor Bartha
Wright Laboratory (AAAT-1) Bldg 635
2185 Avionics Circle 
WPAFB, OH 45433-7301

FAX: 513 476 4302
Phone: 513 255 7647


From jdc@meceng.coe.neu.edu Thu Oct 28 07:49:22 1993
Date: Mon, 16 Aug 1993 15:54:51 -0400 (EDT)
From: jdc@meceng.coe.neu.edu (Jill D. Crisman)
Subject: robot project descriptions
To: wlim@gdstech.grumman.com
X-Envelope-To: wlim@gdstech.grumman.com
Content-Transfer-Encoding: 7BIT

Willie,

I'm sorry for the late response to your request for "brief"
descriptions of mobile robot projects.  I hope that we are in time to
be added to the compilation of the list.
We currently have two mobile robot projects:
 
_______________________________________________________
The first project:

a) Our robot is called Phaeton
 
b) We are the Robotic and Vision Systems Laboratory (RVSL) at
Northeastern University.
 
c) Our long term objective is to develop autonomous, general-purpose
robotic systems by first developing general-purpose semi-autonomous
systems, then incrementally decreasing the system's dependence on
human interaction.  Our current focus is on an interactive, general-purpose,
mobile robot control architecture which is natural and intuitive for
humans to use.  Immediate goals include the discovery of necessary and
sufficient command primitives for interactive navigation by deixis
(pointing). 
 
d) We have developed a simple control architecture consisting of six
control loops and have shown that this architecture is stable.  We are
currently working on:
	a simulated system to investigate a necessary and complete set
of deictic primitives,
	implementation of deictic primitives on a robotic system, and
	development of a future general-purpose wheelchair robot test
platform.
 
e) The visual tracking and basic behaviors will be combined to allow
the robot to follow targets.  Then deictic primitives will be
designed/discovered to allow interactive control of the robot via its
digital camera images.  Ultimately a wheelchair with on-board power
will provide a comprehensive platform allowing a user to take the
robot for a spin without much effort.  The robot's sensory capacity
will be enhanced over time.
 
   1) Robot configuration: 
 
Robot configuration: We have a Denning MRV3 base with a ring of 24
Polaroid ultrasonic sensors and an on-board 68000 which controls the
sonar and motors.  Two pairs of CCD cameras are used at various times:
monochrome (Cognex) and color (Pulnix).  Two independent camera
controllers (Directed Perception) provide pan and tilt.  A rotating
platform will be built to track the wheels' direction since the
Denning's synchronous steering results in the body never turning.
Vision processing is provided by a monochrome Cognex 4400 and color
Datacube boards (Digicolor and Framestore); both are VME-based.  
 
   2) Type of workstations:
 
The primary processor is an off-board Sun 4/330 connected by
serial-line tether to the Denning.  Some simulations are done on
Macintosh computers.
 
   3) The development environment:
 
Development environment: On the Sun, X and emacs provide a basic
environment.  The Cognex is programmed via a PC-based front-end using
epsilon.
 
   4) Languages used:  C on the Suns, Macintosh and Cognex.
 
_______________________________________________________
The second project:
 
a) Our robot is called the Lobster Robot
 
b) We are the Robotic and Vision Systems Laboratory (RVSL) at
Northeastern University.
 
c) Our long term objective is to develop a shallow water walking robot
based on biological models of lobsters.  This is particularly
challenging due to the surge and surf action of shallow water.
 
d) We have developed a biologically based control architecture and
have shown in simulation that this architecture produces neural
patterns observed in walking lobsters.  We are currently working on:
	a simulation to investigate kinematics and dynamics of an
eight-legged underwater vehicle,
	implementation of the central pattern generator on a single
legged robot, and 
	development of our future 8-legged underwater walking machine
 
e) Our future work includes the implementation of our biologically
based control architecture on an eight-legged underwater platform.  We
are also studying lobster behaviors which allow them to ambulate in
hydrodynamic environments and integrating these behaviors into our
current control architecture.
 
   1) Robot configuration: 
 
We are currently designing our eight-legged robot system.  At present, we
are using a single-legged system which we have designed and constructed.
The leg has 3 DOF, each driven by a DC motor under PWM control, and an
HC11-based single-board computer controls its joints.
 
   2) Type of workstations:
 
Simulations are done using Macintosh computers.
 
   3) The development environment:
 
We are currently programming our single-legged robot in assembler using
the environment provided with the single-board controller.  We are
still investigating which development environment we will use for our
eventual system.
 
   4) Languages used:  C, Pascal, and Assembly.
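The PWM leg drive mentioned in the robot configuration above can be
sketched as follows (a hypothetical illustration, not RVSL's
controller code; the 8-bit duty-cycle resolution and the names used
here are assumptions about an HC11-class setup):

```c
/* Hedged sketch: convert a signed, normalized motor command into a
 * direction flag and an 8-bit PWM duty cycle, as one might on an
 * HC11-class single-board controller. */
typedef struct {
    int forward;    /* 1 = forward, 0 = reverse */
    unsigned duty;  /* duty cycle, 0..255 (assumed 8-bit timer) */
} PwmOut;

PwmOut pwm_command(double cmd)   /* cmd in [-1.0, 1.0] */
{
    PwmOut out;
    if (cmd > 1.0)  cmd = 1.0;   /* clamp out-of-range commands */
    if (cmd < -1.0) cmd = -1.0;
    out.forward = cmd >= 0.0;
    double mag = cmd >= 0.0 ? cmd : -cmd;
    out.duty = (unsigned)(mag * 255.0 + 0.5);  /* round to 8-bit duty */
    return out;
}
```

On real hardware the duty value would be written to a timer
output-compare register each PWM period, with the direction flag
driving an H-bridge.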
 
 




