While easily modified, the present implementation of Altaira is tied quite closely to the Zambonibug robot[1] and its complement of sensors and motors, shown in Figure 1.
The Zambonibug is a LEGO robot controlled by a Handy Board single-board computer, provided by the VPC committee to the competitors. The robot has two touch sensors and five reflective infrared sensors. The touch sensors are in the front of the robot, and are used to detect obstacles. The layout of the five infrared sensors corresponds to tape marks on the road tiles navigated by the robot: one center and two inboard sensors (used to follow a black line on the tiles), and two outboard sensors (used to detect tape marks encoding intersection tile types). The four tile types are shown in Figure 2.
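A rough sketch of this sensor complement as a data structure may help make the roles concrete; the field names and reading conventions below are illustrative assumptions rather than Altaira's actual representation.

    /* Illustrative sketch only: field names and the use of plain int
       readings are assumptions, not Altaira's actual sensor types. */
    typedef struct {
        int touch_left, touch_right;              /* front bumpers: 1 = obstacle contact */
        int ir_center;                            /* line following */
        int ir_inboard_left, ir_inboard_right;    /* line following */
        int ir_outboard_left, ir_outboard_right;  /* intersection tape marks */
    } sensor_frame;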
There are two motors, one powering each of the left and right drive wheels; directional control is achieved by switching the motors on and off individually. It was necessary to modify the physical configuration of the Zambonibug slightly: a 2:1 gear reduction was needed to slow the robot down and increase the motor torque.
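This on/off steering scheme can be sketched as follows; the motor_set() primitive is a stub standing in for the Handy Board's actual motor commands, which are not described here.

    #include <stdio.h>

    /* Sketch of on/off differential steering; motor_set() is an
       illustrative stub, not the Handy Board's motor API. */
    typedef enum { LEFT_MOTOR, RIGHT_MOTOR } motor_id;

    static void motor_set(motor_id m, int on)
    {
        printf("%s motor %s\n", m == LEFT_MOTOR ? "left" : "right",
               on ? "on" : "off");
    }

    static void drive_forward(void) { motor_set(LEFT_MOTOR, 1); motor_set(RIGHT_MOTOR, 1); }
    static void turn_left(void)     { motor_set(LEFT_MOTOR, 0); motor_set(RIGHT_MOTOR, 1); }
    static void turn_right(void)    { motor_set(LEFT_MOTOR, 1); motor_set(RIGHT_MOTOR, 0); }

    int main(void)
    {
        drive_forward();
        turn_left();
        turn_right();
        return 0;
    }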
The Zambonibug is controlled by a program running on a host computer, communicating with the Handy Board by a serial communications link. A communication protocol was developed by the VPC committee and provided to competitors; however, as the communication between the host and the Handy Board was found to be the limiting factor on the rate of rule firings, this protocol was replaced by a more compact protocol implemented in an interpreted C dialect[18].
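The replacement protocol itself is not reproduced here, but the following sketch illustrates the general idea of a more compact encoding: packing a pair of motor commands into a single byte rather than sending a longer message per rule firing. The packing scheme is purely an assumption for illustration.

    #include <stdint.h>

    /* Illustrative packing only; the actual compact protocol used by
       Altaira is not specified in this description. */
    enum motor_cmd { MOTOR_OFF = 0, MOTOR_ON = 1 };

    /* Host side: pack left/right motor commands into one byte. */
    uint8_t pack_drive_command(enum motor_cmd left, enum motor_cmd right)
    {
        return (uint8_t)((left & 0x0F) | ((right & 0x0F) << 4));
    }

    /* Handy Board side: recover the two commands from the byte. */
    void unpack_drive_command(uint8_t byte, enum motor_cmd *left,
                              enum motor_cmd *right)
    {
        *left  = (enum motor_cmd)(byte & 0x0F);
        *right = (enum motor_cmd)(byte >> 4);
    }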
Altaira provides both an interface to the actual modified Zambonibug robot and a simulated robot environment, shown in Figure 3. The simulated environment provides the user with a course editing window (Figure 3a) and a robot placement window (Figure 3b). When in operation, the simulated robot is moved in the placement window, and simulated sensors return brightness data sampled from scanned LEGO tile images.
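As a rough illustration of this sampling step, the sketch below reads a brightness value from an assumed 8-bit grayscale tile image at a simulated sensor position; the image representation and the white value returned outside the image bounds are assumptions, not the simulator's actual behavior.

    /* Illustrative sketch of sampling a simulated sensor reading from a
       scanned tile image; the data layout is assumed, not Altaira's own. */
    typedef struct {
        int width, height;            /* image size in pixels */
        const unsigned char *pixels;  /* 8-bit grayscale, row-major */
    } tile_image;

    /* Return the brightness under a simulated sensor at pixel (x, y);
       positions off the tile read as white (255) here by assumption. */
    int simulated_sensor_reading(const tile_image *img, int x, int y)
    {
        if (x < 0 || y < 0 || x >= img->width || y >= img->height)
            return 255;
        return img->pixels[y * img->width + x];
    }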