A debugging tool I built and used for the two-wheeled robots in the Robotics Programming module at UCL. The tool visualises sensor readings, as well as the position and orientation of the robot (estimated by the program running on the robot).
For the module, we had to program the two-wheeled robots to solve a fixed-size maze. The final task was to write a program that traverses the maze and then goes from start to finish in the quickest time.
There were 9 sensors on the robot: 4 infrared distance sensors (2 at the front and 2 on the sides), 1 ultrasound distance sensor (front centre), 2 bumper switches to detect when the robot hits something, and 2 wheel encoders measuring the distance travelled by each wheel.
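One telemetry sample therefore bundles nine values plus a timestamp. A minimal sketch of such a record, matching the sensor layout above — the field names and units are illustrative assumptions, not the tool's actual schema:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One snapshot of all nine sensors (hypothetical layout)."""
    timestamp_ms: int        # robot clock time of the reading
    ir_front_left: float     # infrared distances, e.g. in cm
    ir_front_right: float
    ir_side_left: float
    ir_side_right: float
    ultrasound_front: float  # ultrasound distance, front centre
    bumper_left: bool        # True while the switch is pressed
    bumper_right: bool
    encoder_left: int        # cumulative encoder ticks per wheel
    encoder_right: int
```

Keeping every sample in one flat record makes it cheap to log, plot, and diff readings against each other when chasing down sensor noise.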
Each type of sensor had its own characteristics and error profile. To write a robust program, we needed to understand these errors to some degree and know the limitations of each sensor. This is one of the problems the tool tries to solve; the other is rewinding and playback: understanding the behaviour of the program by replaying its inputs (sensor values) and outputs (wheel voltages).
What the tool does
The tool connects to a robot through WiFi and listens for events. The program on the robot sends regular updates of its state and sensor readings. These are shown live on the visualisation and stored in a database file, so that they can be inspected even after the robot disconnects.
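The record-and-replay idea can be sketched in a few lines. This assumes the robot sends newline-delimited JSON events; the event fields, table schema, and function names below are illustrative, not the tool's actual protocol:

```python
import json
import sqlite3

def open_log(path=":memory:"):
    """Open (or create) the event log database."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events ("
        "t_ms INTEGER, "   # robot timestamp in milliseconds
        "kind TEXT, "      # e.g. 'sensors' or 'pose' (assumed event kinds)
        "payload TEXT)"    # raw JSON kept for later inspection
    )
    return conn

def record(conn, line):
    """Parse one event line received from the robot and append it."""
    event = json.loads(line)
    conn.execute(
        "INSERT INTO events (t_ms, kind, payload) VALUES (?, ?, ?)",
        (event["t_ms"], event["kind"], json.dumps(event)),
    )

def replay(conn, since_ms=0):
    """Return events in time order, which is what enables rewinding."""
    return conn.execute(
        "SELECT t_ms, kind, payload FROM events "
        "WHERE t_ms >= ? ORDER BY t_ms",
        (since_ms,),
    ).fetchall()
```

Because every event is persisted before it is drawn, the live view and the after-the-fact playback are just two consumers of the same log.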
This project was inspired by Bret Victor's Seeing Spaces talk. Bret mentions that in order to understand the behaviour of a robot, we need to "get inside the robot's head and see what it's seeing, and see what it's thinking". This tool only focuses on the "what the robot sees" part, but arguably the other is more exciting: how could we represent a program's behaviour?
Also see my maze solving program which won 1st place in the final competition.