The Simulator


Extensive experimentation of the type described in this article is not feasible with physical robotic systems. Consequently, to conduct meaningful research in simulation that might apply to the real world, a well-designed simulator is needed. Though not directly based upon any single robotic system, Noda's Soccer Server [12], pictured in Figure 3, captures enough real-world complexities to be an indispensable tool. This simulator is realistic in many ways: (i) the players' vision is limited; (ii) the players can communicate by posting to a blackboard that is visible to all players; (iii) all players are controlled by separate processes; (iv) each player has 10 teammates and 11 opponents; (v) each player has limited stamina; (vi) actions and sensors are noisy; and (vii) play occurs in real time. The simulator, acting as a server, provides a domain and supports users who wish to build their own agents (clients).
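To make the client-server arrangement concrete, a minimal client for a server of this kind can be sketched in a few lines. The Soccer Server's interface is plain-text commands sent over a UDP socket; the port number and exact command syntax below are illustrative assumptions based on the description in this section, not an authoritative specification.

```python
import socket

def make_command(action, *args):
    """Format an action command string, e.g. (dash 100) or (kick 100 45)."""
    parts = " ".join(str(a) for a in args)
    return f"({action} {parts})" if parts else f"({action})"

class SoccerClient:
    """Hypothetical sketch of a client process controlling one player."""

    def __init__(self, host="localhost", port=6000):  # port is an assumption
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(self, command):
        # Each client process sends its own commands independently,
        # matching the simulator's one-process-per-player design.
        self.sock.sendto(command.encode(), self.addr)
```

For example, `make_command("dash", 100)` yields `"(dash 100)"`, and `make_command("kick", 100, 45)` yields `"(kick 100 45)"`, mirroring the command forms that appear in the trace of Figure 4.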

Figure 3: The Soccer Server system

Figure 4 illustrates the format of the communication between the server and a specific client.

Figure 4: A trace of the simulator's input and output to the client controlling player 1 (indicated ``CLIENT''). The player moves to the ball and then shoots it towards the goal. Commands from the player are preceded by ``**-> ''. Dashes are followed by a power (dashes are always in the direction that the player is facing), turns are followed by an angle, and kicks are followed by a power and an angle. Sensory information from the server comes in the form of audial and visual strings. In both cases, the number after the type indicator (``hear'' or ``see'') indicates the elapsed time in the match. Audial information then indicates either that the referee is speaking or the angle from which the sound came. Visual information includes the distance to and angle of each visible object.
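The sensory strings described in the caption lend themselves to simple parsing. The sketch below recovers the message type, the elapsed time, and (for audial messages) the speaker or angle; the exact string syntax is an assumption for illustration, reconstructed from the description above rather than taken from the simulator's specification.

```python
import re

def parse_sensor(msg):
    """Parse a sensory string such as "(hear 144 30 shoot the ball)"
    or "(see 128 ((ball) 14 0) ...)" into a dictionary."""
    m = re.match(r"\((see|hear)\s+(\d+)\s+(.*)\)$", msg)
    if not m:
        raise ValueError(f"unrecognized message: {msg}")
    kind, time, body = m.group(1), int(m.group(2)), m.group(3)
    if kind == "hear":
        # The token after the time is either "referee" or the angle
        # from which the sound came.
        source, _, content = body.partition(" ")
        return {"type": "hear", "time": time, "source": source,
                "message": content}
    # For "see" messages, the body is the list of visible objects,
    # each with a distance and an angle.
    return {"type": "see", "time": time, "objects": body}
```

For instance, `parse_sensor("(hear 144 30 shoot the ball)")` reports a message heard at time 144 from an angle of 30, matching the communication example discussed below.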

Since the client's vision is limited to 45 degrees on either side, not all objects are visible at each sensory step. For example, at the beginning of the trace in Figure 4, the client sees two teammates and one opponent (player brazil 1). However, after dashing once, it is no longer able to see the opponent. Similarly, after dashing a second time, it is no longer able to see the center of the penalty area: (flag p r c). After kicking the ball, when the client turns to face the goal, the opponent comes back into view.

The method of communication is illustrated by the message from teammate number 2 that is heard at time 144 (``shoot the ball''), and by the spoken response ``shooting now.'' Two messages from the referee, indicating the successful goal and the subsequent restart, are also present at the end of the trace.

Both the sensors and the actions in the simulator are noisy. Notice that even though the player begins by facing directly at the stationary ball (the ball's angle is 0) and dashes straight toward it, the ball does not remain directly in front of the player. Before its next two dashes, the client turns to face the ball again. Although not apparent from this trace, when players are far enough away, their uniform numbers, or even their team, may not be visible.
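The effect just described, in which the ball drifts away from the player's facing direction even under repeated straight dashes, is the kind of behavior an actuator-noise model produces. The sketch below shows one such model; the noise distributions and magnitudes are illustrative assumptions, not the simulator's actual parameters.

```python
import random

def noisy_dash(power, rng=random):
    """Model a dash whose executed power and direction deviate slightly
    from the commanded values. Noise magnitudes here are assumed for
    illustration only."""
    executed_power = power * (1 + rng.uniform(-0.05, 0.05))
    drift_angle = rng.gauss(0, 2.0)  # degrees off the facing direction
    return executed_power, drift_angle
```

Under a model like this, a sequence of dashes accumulates small angular errors, which is why the client in the trace must periodically turn to face the ball again.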

The trace in Figure 4 begins at elapsed time 124 and continues through 150. Since each time increment corresponds to 0.1 seconds of real time, visual sensor information arrives twice a second, and the entire trace, from the moment pictured in Figure 4 until the ball enters the goal, occurs in about 2.5 seconds. Thus, the action in the simulator occurs in real time.
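The timing arithmetic above can be written out explicitly. The tick length of 0.1 seconds is taken from the text; the helper function name is introduced here for illustration.

```python
# Each simulator time increment corresponds to 0.1 seconds of real time.
TICK_SECONDS = 0.1

def elapsed_real_time(start_tick, end_tick):
    """Real-time duration between two simulator time stamps."""
    return (end_tick - start_tick) * TICK_SECONDS
```

The trace runs from tick 124 to tick 150, so `elapsed_real_time(124, 150)` gives 26 increments, or 2.6 seconds of real time, consistent with the roughly 2.5 seconds quoted above.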

All of these simulator features combine to make it a very challenging and realistic environment in which to conduct research. The lessons we learn in the simulator will help us greatly as we implement the strategic levels of our real robotic system.


Peter Stone
Mon Mar 31 12:26:29 EST 1997