OpenADR: Data Visualization

As the robot’s navigational algorithms have grown more complex, I’ve run into several occasions where I don’t understand the logical paths the robot is taking.  Since the design is only going to get more complex from here on out, I decided to implement real-time data visualization to let me see what the robot “sees” and view what actions it’s taking in response.  With the Raspberry Pi now on board, it’s incredibly simple to implement a web page that hosts this information in an easy-to-use format.  The code required to enable data visualization is broken up into four parts: the firmware that sends the data, the serial server that reads the data from the Arduino, the web server that hosts the data visualization web page, and the web page itself that displays the data.

Firmware

The firmware was the simplest part.  I only added print statements that fire whenever the robot changes speed or reads the ultrasonic sensors, writing key-value pairs over the USB serial port to the Raspberry Pi.
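For illustration, the stream arriving at the Raspberry Pi might look roughly like the lines below; the key names and values here are made up, not the firmware’s actual output.

leftSpeed:128
rightSpeed:64
sonar0:35
sonar1:200
sonar2:87
sonar3:200
sonar4:41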

Serial Server

The Raspberry Pi then runs a serial server which parses these key-value pairs and converts them into a dictionary holding the robot’s left and right motor speeds and the obstacles seen by each ultrasonic sensor.  This dictionary is then converted into JSON and written out to a file.
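A minimal sketch of what such a serial server might look like, assuming pyserial, an Arduino on /dev/ttyACM0 at 9600 baud, and the illustrative key names from above:

import json
import serial

SONAR_KEYS = ["sonar0", "sonar1", "sonar2", "sonar3", "sonar4"]

def run(port="/dev/ttyACM0", out_path="data.json"):
    ser = serial.Serial(port, 9600, timeout=1)
    # Latest known state of the robot, updated as lines arrive.
    state = {"leftSpeed": 0, "rightSpeed": 0,
             "sonar": {key: 0 for key in SONAR_KEYS}}

    while True:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if ":" not in line:
            continue
        key, value = line.split(":", 1)
        if key in ("leftSpeed", "rightSpeed"):
            state[key] = int(value)
        elif key in SONAR_KEYS:
            state["sonar"][key] = int(value)
        # Rewrite the JSON file so the web page always sees the latest state.
        with open(out_path, "w") as f:
            json.dump(state, f)

if __name__ == "__main__":
    run()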

Below is a sketch of what the resulting JSON data might look like (the field names are illustrative, matching the sketch above).
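{
    "leftSpeed": 128,
    "rightSpeed": 64,
    "sonar": {"sonar0": 35, "sonar1": 200, "sonar2": 87, "sonar3": 200, "sonar4": 41}
}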

Web Server

Using a Python HTTP server example, the Raspberry Pi runs a web server that hosts both the generated JSON file and the web page used to view the data.
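Python’s built-in HTTP server is enough for this; here is a minimal sketch, assuming the server is started from the directory containing the page and the JSON file:

import http.server
import socketserver

PORT = 8000  # Illustrative port; any free port works.

# Serve the current directory, which is assumed to hold the
# visualization page (e.g. index.html) and the generated JSON file.
handler = http.server.SimpleHTTPRequestHandler

with socketserver.TCPServer(("", PORT), handler) as httpd:
    httpd.serve_forever()

The web page can then fetch the JSON file from the same server and redraw itself whenever new data arrives.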

Web Page

 

The visualization web page uses an HTML canvas and JavaScript to display the data from the JSON file.  The velocity and direction of the robot are shown by a curved vector, its length representing the speed of the robot and its curvature representing the radius of the robot’s turn.  The obstacles seen by the five ultrasonic sensors are represented on the canvas by five points.
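The drawing itself happens in JavaScript, but the mapping from the two motor speeds to the vector’s length and curvature can be sketched on its own; the snippet below assumes a standard differential-drive model with a made-up wheel separation, so the constants are illustrative rather than the page’s actual values.

import math

WHEEL_SEPARATION = 0.15  # Distance between the wheels; made-up value.

def arc_parameters(left_speed, right_speed):
    """Map left/right wheel speeds to the speed and signed turn radius
    used to draw the curved vector (differential-drive model)."""
    speed = (left_speed + right_speed) / 2.0
    if left_speed == right_speed:
        return speed, math.inf  # Driving straight: infinite turn radius.
    radius = (WHEEL_SEPARATION / 2.0) * (left_speed + right_speed) / (right_speed - left_speed)
    return speed, radius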

[Image: the data visualization web page]

The picture above is the resulting web page without any JSON data.

Here’s a video of the robot running alongside a video of the data that’s displayed.  As you can see, the data visualization isn’t exactly real time.  Due to communication delays over the network, some of the robot’s decisions and sensor readings get lost.

Further Steps

Having the main structure of the data visualization set up provides a powerful interface to the robot that can easily be extended with more functionality in the future.  I’d like to add more data items as I add more sensors to the robot, such as the floor color sensor, an accelerometer, and a compass.  At some point I also plan on creating a way to pass data to the robot from the web page, providing an interface for controlling the robot remotely.

I’d also like to improve the existing data visualization interface to minimize the amount of data lost to network and processing delays.  One way to do this would be to get rid of the JSON file and stream the data directly to the web page, eliminating unnecessary file I/O delays.  The Arduino also prints out the ultrasonic sensor data one sensor at a time, requiring the serial server to read five lines from the serial port to get a reading from every sensor.  Combining all of the sensor data into a single line would also help improve the speed.

Another useful feature would be to record the sensor data rather than just having it visible from the web page.  Storing all of the written JSON files and replaying the run from the robot’s perspective would make it possible to review the data multiple times.
