OpenADR: Data Visualization

As the navigational algorithms used by the robot have grown more complex, I’ve run into several occasions where I didn’t understand the logical paths the robot was taking.  Since the design is only going to get more complex from here on out, I decided to implement real-time data visualization so I can see what the robot “sees” and view what actions it takes in response.  With the Raspberry Pi now on board, it’s incredibly simple to host this information on a web page in an easy-to-use format.  The code required to enable data visualization is broken into four parts: the firmware that sends the data, the serial server that reads the data from the Arduino, the web server that hosts the data visualization web page, and the web page itself that displays the data.

Firmware

// Prints "velocity=-255,255"
Serial.print("velocity=");
Serial.print(-FULL_SPEED);
Serial.print(",");
Serial.println(FULL_SPEED);
// Prints "point=90,25" which means 25cm at the angle of 90 degrees
Serial.print("point=");
Serial.print(AngleMap[i]);
Serial.print(",");
Serial.println(Distances[i]);


The firmware was the simplest part: I only added print statements to the code where the robot changes speed or reads the ultrasonic sensors.  These print key-value pairs over the USB serial port to the Raspberry Pi.

Serial Server

import time
import serial
import json

# The scaling of the image is 1cm:1px

# JSON file to output to
jsonFile = "data.json"
# The graphical center of the robot in the image
centerPoint = (415, 415)
# Width of the robot in cm/px
robotWidth = 30
# Radius of the wheels on the robot
wheelRadius = 12.8
# Minimum sensing distance
minSenseDistance = 2
# Maximum sensing distance
maxSenseDistance = 400
# The distance from the robot to display the turn vector at
turnVectorDistance = 5

# Initialize global data variables
points = {}
velocityVector = [0, 0]

# Serial port to use
serialPort = "/dev/ttyACM0"

robotData = {}

ser = serial.Serial(serialPort, 115200)

# Parses a serial line from the robot and extracts the
# relevant information
def parseLine(line):
    status = line.split('=')
    statusType = status[0]
    # Parse the obstacle location
    if statusType == "point":
        coordinates = status[1].split(',')
        points[int(coordinates[0])] = int(coordinates[1])
    # Parse the velocity of the robot (x, y)
    elif statusType == "velocity":
        velocities = status[1].split(',')
        velocityVector[0] = int(velocities[0])
        velocityVector[1] = int(velocities[1])

def main():
    # Three possible test print files to simulate the serial link
    # to the robot
    #testPrint = open("TEST_straight.txt")
    #testPrint = open("TEST_tightTurn.txt")
    #testPrint = open("TEST_looseTurn.txt")
    time.sleep(1)
    ser.write('1')
    #for line in testPrint:
    while True:
        # Flush the input so we get the latest data and don't fall behind
        #ser.flushInput()
        line = ser.readline()
        parseLine(line.strip())
        robotData["points"] = points
        robotData["velocities"] = velocityVector
        jsonFilePointer = open(jsonFile, 'w')
        jsonFilePointer.write(json.dumps(robotData))
        jsonFilePointer.close()

if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        print "Quitting"
        ser.write('0')


The Raspberry Pi then runs a serial server which processes these key-value pairs and converts them into a dictionary representing the robot’s left and right motor speeds and the obstacles seen by each ultrasonic sensor.  This dictionary is then serialized to JSON and written out to a file.
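As a minimal sketch of that conversion, assuming the same `point=` and `velocity=` line formats shown above:

```python
import json

# Convert a batch of key=value lines from the robot into the
# dictionary that gets written out as JSON
def parse_lines(lines):
    points = {}
    velocities = [0, 0]
    for line in lines:
        key, _, value = line.partition('=')
        fields = value.split(',')
        if key == "point":
            # point=<angle>,<distance>
            points[int(fields[0])] = int(fields[1])
        elif key == "velocity":
            # velocity=<left>,<right>
            velocities = [int(fields[0]), int(fields[1])]
    return {"velocities": velocities, "points": points}

sample = ["velocity=-255,-255", "point=0,10", "point=90,200"]
print(json.dumps(parse_lines(sample)))
# -> {"velocities": [-255, -255], "points": {"0": 10, "90": 200}}
```

Note that `json.dumps` turns the integer angle keys into strings, which is why the web page’s `drawPoints` converts them back with `Number(key)`.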

{"velocities": [-255, -255], "points": {"0": 10, "90": 200, "180": 15, "45": 15, "135": 413}}


This is an example of what the resulting JSON data looks like.

Web Server

import SimpleHTTPServer
import SocketServer
PORT = 80
Handler = SimpleHTTPServer.SimpleHTTPRequestHandler
httpd = SocketServer.TCPServer(("", PORT), Handler)
print "serving at port", PORT
httpd.serve_forever()


Using a stock Python HTTP server example, the Raspberry Pi runs a web server which hosts both the generated JSON file and the web page that’s used for viewing the data.
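The script above is Python 2; `SimpleHTTPServer` and `SocketServer` were renamed `http.server` and `socketserver` in Python 3. A Python 3 equivalent might look like the sketch below (the `make_server` helper is my own wrapper, not part of the original script):

```python
# Python 3 sketch of the same static file server.
# make_server is a hypothetical helper added for illustration.
import http.server
import socketserver

def make_server(port):
    # Serves files (data.json, index.html) from the current directory
    handler = http.server.SimpleHTTPRequestHandler
    return socketserver.TCPServer(("", port), handler)

# To run it on port 80 (requires root): make_server(80).serve_forever()
```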

Web Page

<!doctype html>
<html>
<head>
<script type="text/javascript" src="http://code.jquery.com/jquery.min.js"></script>
<style>
body{ background-color: white; }
canvas{border:1px solid black;}
</style>
<script>
$(function()
{
    var myTimer = setInterval(loop, 1000);
    var xhr = new XMLHttpRequest();
    var c = document.getElementById("displayCanvas");
    var ctx = c.getContext("2d");
    // The graphical center of the robot in the image
    var centerPoint = [415, 415];
    // Width of the robot in cm/px
    var robotWidth = 30;
    // Radius of the wheels on the robot
    var wheelRadius = 12.8;
    // Minimum sensing distance
    var minSenseDistance = 2;
    // Maximum sensing distance
    var maxSenseDistance = 400;
    // The distance from the robot to display the turn vector at
    var turnVectorDistance = 5;

    // Update the image
    function loop()
    {
        ctx.clearRect(0, 0, c.width, c.height);
        // Set global stroke properties
        ctx.lineWidth = 1;

        // Draw the robot
        ctx.strokeStyle = "#000000";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], (robotWidth / 2), 0, 2 * Math.PI);
        ctx.stroke();

        // Draw the robot's minimum sensing distance as a circle
        ctx.strokeStyle = "#00FF00";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + minSenseDistance), 0, 2 * Math.PI);
        ctx.stroke();

        // Draw the robot's maximum sensing distance as a circle
        ctx.strokeStyle = "#FFA500";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + maxSenseDistance), 0, 2 * Math.PI);
        ctx.stroke();

        // Fetch the latest data; the timestamp query string defeats caching
        xhr.onreadystatechange = processJSON;
        xhr.open("GET", "data.json?" + new Date().getTime(), true);
        xhr.send();
    }

    function processJSON()
    {
        if (xhr.readyState == 4)
        {
            var robotData = JSON.parse(xhr.responseText);
            drawVectors(robotData);
            drawPoints(robotData);
        }
    }

    // Calculate the turn radius of the robot from the two velocities
    function calculateTurnRadius(leftVector, rightVector)
    {
        var slope = ((rightVector - leftVector) / 10.0) / (2.0 * wheelRadius);
        var yOffset = ((Math.max(leftVector, rightVector) + Math.min(leftVector, rightVector)) / 10.0) / 2.0;
        var xOffset = 0;
        if (slope != 0)
        {
            xOffset = Math.round(yOffset / slope);
        }
        return Math.abs(xOffset);
    }

    // Calculate the angle required to display a turn vector with
    // a length that matches the magnitude of the motion
    function calculateTurnLength(turnRadius, vectorMagnitude)
    {
        var circumference = 2.0 * Math.PI * turnRadius;
        return Math.abs((vectorMagnitude / circumference) * (2 * Math.PI));
    }

    function drawVectors(robotData)
    {
        var leftVector = robotData.velocities[0];
        var rightVector = robotData.velocities[1];
        // Calculate the magnitude of the velocity vector in pixels.
        // The 2.5 divisor was determined arbitrarily to provide an
        // easy to read line
        var vectorMagnitude = (((leftVector + rightVector) / 2.0) / 2.5);
        ctx.strokeStyle = "#FF00FF";
        ctx.beginPath();
        if (leftVector == rightVector)
        {
            // Driving straight: a vertical line, pointing up for forward
            var vectorEndY = centerPoint[1] - vectorMagnitude;
            ctx.moveTo(centerPoint[0], centerPoint[1]);
            ctx.lineTo(centerPoint[0], vectorEndY);
        }
        else
        {
            var turnRadius = calculateTurnRadius(leftVector, rightVector);
            if (turnRadius == 0)
            {
                // Turning in place: draw an arc just outside the robot's body
                var outsideRadius = turnVectorDistance + (robotWidth / 2.0);
                var rotationMagnitude = (((Math.abs(leftVector) + Math.abs(rightVector)) / 2.0) / 2.5);
                var turnAngle = calculateTurnLength(outsideRadius, rotationMagnitude);
                if (leftVector < rightVector)
                {
                    ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI) - turnAngle, (1.5 * Math.PI));
                }
                if (leftVector > rightVector)
                {
                    ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI), (1.5 * Math.PI) + turnAngle);
                }
            }
            else
            {
                var turnAngle = 0;
                if (vectorMagnitude != 0)
                {
                    turnAngle = calculateTurnLength(turnRadius, vectorMagnitude);
                }
                var turnVectorCenterX;
                var turnVectorCenterY = centerPoint[1];
                // Turning forward and left
                if ((leftVector < rightVector) && (leftVector + rightVector > 0))
                {
                    turnVectorCenterX = centerPoint[0] - turnRadius;
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, -turnAngle, 0);
                }
                // Turning backwards and left
                else if ((leftVector > rightVector) && (leftVector + rightVector < 0))
                {
                    turnVectorCenterX = centerPoint[0] - turnRadius;
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, 0, turnAngle);
                }
                // Turning forwards and right
                else if ((leftVector > rightVector) && (leftVector + rightVector > 0))
                {
                    turnVectorCenterX = centerPoint[0] + turnRadius;
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI, Math.PI + turnAngle);
                }
                // Turning backwards and right
                else if ((leftVector < rightVector) && (leftVector + rightVector < 0))
                {
                    turnVectorCenterX = centerPoint[0] + turnRadius;
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI - turnAngle, Math.PI);
                }
            }
        }
        ctx.stroke();
    }

    function drawPoints(robotData)
    {
        for (var key in robotData.points)
        {
            var angle = Number(key);
            var rDistance = robotData.points[key];
            var cosValue = Math.cos(angle * (Math.PI / 180.0));
            var sinValue = Math.sin(angle * (Math.PI / 180.0));
            var xOffset = ((robotWidth / 2) + rDistance) * cosValue;
            var yOffset = ((robotWidth / 2) + rDistance) * sinValue;
            var xDist = centerPoint[0] + Math.round(xOffset);
            var yDist = centerPoint[1] - Math.round(yOffset);
            ctx.fillStyle = "#FF0000";
            ctx.fillRect((xDist - 1), (yDist - 1), 2, 2);
        }
    }
});

function changeNavigationType()
{
    var navSelect = document.getElementById("Navigation Type");
    console.log(navSelect.value);
    var http = new XMLHttpRequest();
    var url = "http://ee-kelliott:8000/";
    var params = "navType=" + navSelect.value;
    // GET requests ignore a message body, so the parameters go in the URL
    http.open("GET", url + "?" + params, true);
    http.onreadystatechange = function()
    {
        // Call a function when the state changes
        if (http.readyState == 4 && http.status == 200)
        {
            alert(http.responseText);
        }
    }
    http.send();
}
</script>
</head>
<body>
<div>
<canvas id="displayCanvas" width="830" height="450"></canvas>
</div>
<select id="Navigation Type" onchange="changeNavigationType()">
<option value="None">None</option>
<option value="Random">Random</option>
<option value="Grid">Grid</option>
<option value="Manual">Manual</option>
</select>
</body>
</html>


The visualization web page uses an HTML canvas and JavaScript to display the data in the JSON file.  The velocity and direction of the robot are shown by a curved vector: its length represents the speed of the robot and its curvature represents the radius of the robot’s turn.  The obstacles seen by the five ultrasonic sensors are drawn on the canvas as five points.
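The curvature math can be sanity-checked outside the browser. This Python sketch mirrors the arithmetic in `calculateTurnRadius` above; the `/10` scaling and the 12.8 cm wheel radius come straight from that script:

```python
WHEEL_RADIUS = 12.8  # cm, same constant the web page uses

def turn_radius(left, right):
    # The speed difference between the wheels sets how sharply the robot turns
    slope = ((right - left) / 10.0) / (2.0 * WHEEL_RADIUS)
    # The average speed sets how far the robot travels
    y_offset = ((left + right) / 10.0) / 2.0
    x_offset = 0
    if slope != 0:
        x_offset = round(y_offset / slope)
    return abs(x_offset)

print(turn_radius(100, 200))   # gentle left arc -> 38
print(turn_radius(-255, 255))  # spinning in place -> 0
```

Equal and opposite wheel speeds give a radius of zero, which is why the web page draws an in-place rotation as a small arc around the robot rather than a turn circle.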

[Image: DataImage]

The picture above is the resulting web page without any JSON data.

Here’s a video of the robot running alongside a video of the data being displayed.  As you can see, the data visualization isn’t exactly real time; due to the communication delays over the network, some of the robot’s decisions and sensor readings get lost.

Further Steps

Having the main structure of the data visualization set up provides a powerful interface to the robot that can be used to easily add more functionality in the future.  I’d like to add more data items as I add more sensors to the robot, such as the floor color sensor, an accelerometer, and a compass.  At some point I also plan on creating a way to pass data to the robot from the web page, providing an interface to control the robot remotely.

I’d also like to improve the existing data visualization interface to minimize the amount of data lost to network and processing delays.  One way to do this would be to get rid of the JSON file and stream the data directly to the web page, eliminating unnecessary file I/O delays.  The Arduino also prints the ultrasonic sensor data one sensor at a time, so the serial server has to read five lines from the serial port to get a full set of readings.  Compressing all of the sensor data onto a single line would also help improve the speed.
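Compressing the five `point=` lines into one might look like the sketch below. The `points=angle:distance,...` format is a hypothetical one I’m proposing here, not something the firmware currently emits:

```python
# Parse a hypothetical single-line sensor format such as
# "points=0:10,45:15,90:200,135:413,180:15"
def parse_points_line(line):
    key, _, value = line.partition('=')
    if key != "points":
        raise ValueError("unexpected key: " + key)
    pairs = (item.split(':') for item in value.split(','))
    return {int(angle): int(dist) for angle, dist in pairs}

print(parse_points_line("points=0:10,45:15,90:200,135:413,180:15"))
# -> {0: 10, 45: 15, 90: 200, 135: 413, 180: 15}
```

With this format, one serial read would refresh every sensor at once instead of one fifth of the picture at a time.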

Another useful feature would be to record the sensor data rather than just making it visible from the web page.  Storing all of the written JSON files and replaying the run from the robot’s perspective would make it possible to review the data multiple times.