ICRS: Power Board Prototype Design

Schematic

Board

The design for the board to supply power to both the Pi Zero and robot modules is fairly simple. Its two components are a single-cell LiPo charging circuit based on the MCP73831 and the Pololu adjustable step-up regulator. The charging circuit is fairly straightforward, so I’ll just explain the step-up circuit, which takes the output voltage from the single-cell battery, passes it to the four-pin input side of the step-up regulator, and gets a boosted voltage back from the four-pin output side.

I also added four beefy diodes between the regulator and the 5V supply for the system. My reasoning behind this was to provide the ability to use multiple power boards, and therefore multiple LiPo batteries, in the system. Each regulator is limited to roughly three amps of current, and the various boards, Arduinos, and the Pi Zero take up a fair chunk of that. I was worried that a single regulator wouldn’t be able to supply enough current to the motors, so I added the diodes so that multiple power boards could safely be put in parallel. I also decided to use an adjustable voltage regulator so that its output could be set higher than 5V, making the actual voltage seen after the diodes 5V once the diode forward drop is accounted for.
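As a rough worked example (the diode choice isn’t final, so the forward drop here is an assumption): with Schottky diodes dropping about 0.45V at load, the regulator would be dialed in to

V_set = 5V + V_f ≈ 5V + 0.45V ≈ 5.45V

so that the rail after the diodes lands right at 5V.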

ICRS: Motor Board Prototype Design

The first board I’ll go over is the Motor Board prototype. It’s fairly simple; excluding the Arduino that handles communication and control, it only consists of encoder connectors and a dual H-bridge. My planned use for this prototype will be to control the main drive motors of the robot base module. The schematic and PCB are shown below.

Schematic

Board

H-Bridge

I’m using an L293D chip for the motor driver. As shown in Figure 10 of the datasheet, each half of the chip can function as a bidirectional motor controller. By driving PWM signals to the control inputs of the chip, the speed of the motors can also be controlled. Unfortunately, each half of the bridge can only handle up to 600mA, which is relatively low, but it’ll be sufficient for controlling the basic motors I plan on using on the prototype robot.
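To make that control scheme concrete, here’s a minimal Arduino-style sketch for driving one channel of the L293D. The pin assignments are placeholders, not the actual wiring on my board.

// Drive one L293D channel from an Arduino.
// Pin numbers are placeholder assumptions, not the real board wiring.
const int enablePin = 9;  // EN pin of the half-bridge (PWM-capable)
const int input1Pin = 7;  // IN1
const int input2Pin = 8;  // IN2

void setup() {
  pinMode(enablePin, OUTPUT);
  pinMode(input1Pin, OUTPUT);
  pinMode(input2Pin, OUTPUT);
}

// speed ranges from -255 (full reverse) to 255 (full forward)
void setMotor(int speed) {
  if (speed >= 0) {
    digitalWrite(input1Pin, HIGH);  // forward polarity
    digitalWrite(input2Pin, LOW);
  } else {
    digitalWrite(input1Pin, LOW);   // reverse polarity
    digitalWrite(input2Pin, HIGH);
  }
  analogWrite(enablePin, abs(speed));  // PWM duty cycle sets the speed
}

void loop() {
  setMotor(128);   // ~50% duty cycle forward
  delay(2000);
  setMotor(-128);  // ~50% duty cycle reverse
  delay(2000);
}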

Encoders

The encoders I’m using are KY-040 rotary encoders, which will measure the number of rotations of the motors and provide feedback to the motor controller. The linked description explains how the encoders work better than I can, but essentially the Arduino will measure the motor’s speed and how far it has rotated, then apply more or less current to drive it to the desired end position. I plan on using a basic PID loop for this, which I will cover in a later post.
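As a preview, a bare-bones position loop might look something like the sketch below. The gains are made up, and encoderCount() and setMotor() stand in for helpers that don’t exist yet; this is a sketch of the idea, not the actual firmware.

// Bare-bones PID position loop. The gains are arbitrary placeholders and
// the two helpers below are hypothetical, not part of any real firmware yet.
long encoderCount();     // would return the accumulated encoder ticks
void setMotor(int pwm);  // would drive the H-bridge with a -255..255 value

float Kp = 2.0, Ki = 0.0, Kd = 0.1;
long targetCount = 0;    // desired encoder count for the current move
float integral = 0;
long lastError = 0;

void updatePID() {
  long error = targetCount - encoderCount();  // how far we still have to go
  integral += error;
  long derivative = error - lastError;
  lastError = error;

  int output = Kp * error + Ki * integral + Kd * derivative;
  output = constrain(output, -255, 255);      // clamp to the valid PWM range
  setMotor(output);
}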

In addition to the encoder connectors on the left, I’ve also added a basic debounce circuit in the top left of the schematic. Because these encoders use mechanical switches, they’re subject to mechanical bouncing of the switch contacts, so I plan on using a capacitor to act as a low-pass filter that absorbs these bounces and cleans up the encoder signal.
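For a rough sense of scale (these component values are placeholders, not the final design), a 10kΩ pull-up paired with a 100nF capacitor gives a time constant of

τ = RC = 10kΩ × 100nF = 1ms

which sets how long the filter holds the line steady after an edge; the actual values will need to be tuned against the encoder’s pulse rate so real transitions don’t get filtered out too.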

ICRS: Core Module Design

Before doing the circuit design and PCB layout I wanted to briefly outline the architecture of the core robot modules and the communication between them. The core modules are the ones that will be present on every robot in the swarm and provide the critical functionality that’s required for the robot to operate. To make programming easier and reduce the load on the main processor, I’ve decided that each module will have a processor that will intelligently communicate with the main board and abstract away as many unnecessary details as possible.

Main Processor

The main board will be a single board computer that will do all of the “thinking” for the whole robot. It will handle communication to the swarm master and pass information between each of the modules within a robot. Due to the large community and low cost I’ve decided to use the Pi Zero W as the main processor. I’ll be using the GPIO header as the main connector and each module will be connected together through this common header. Each of the submodules will be I2C slaves and will be directed by the Raspberry Pi on the I2C bus. The Pi Zero W also has the benefit of built-in WiFi which further reduces the cost and complexity of the robot.
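To make the bus arrangement concrete, here’s a minimal sketch of what one of the submodule processors could look like on the I2C side. The 0x10 address and the single status byte are placeholder assumptions; the real command set isn’t defined yet.

// Submodule acting as an I2C slave to the Pi Zero. The address and the
// one-byte status register are placeholders, not a finalized protocol.
#include <Wire.h>

const byte MODULE_ADDRESS = 0x10;
volatile byte lastCommand = 0;
volatile byte statusByte = 0;

void receiveCommand(int byteCount) {
  while (Wire.available()) {
    lastCommand = Wire.read();    // remember the most recent command byte
  }
}

void sendStatus() {
  Wire.write(statusByte);         // report the module's current status
}

void setup() {
  Wire.begin(MODULE_ADDRESS);     // join the bus as a slave
  Wire.onReceive(receiveCommand); // called when the Pi writes to us
  Wire.onRequest(sendStatus);     // called when the Pi reads from us
}

void loop() {
  // Act on lastCommand and update statusByte here.
}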

Power Module

The power module will be fairly dumb, with its only task being to supply power to the whole module stack. For the first iteration I only plan on measuring battery voltage/percentage via the onboard processor, but I may expand these capabilities in the future to include things like power usage, current, battery health, etc.

Motor Controller

The motor controller board will be responsible for the actual movement of the robot. My initial design will have the ability to control two motors as well as encoders to close the loop and verify that the robot has moved where expected. As I mentioned in my previous post, encoders are subject to drift, so these encoders will only verify that the robot’s movement is in the right ballpark and will only be used for a single move. This means that the encoder position will be reset after each move is completed.

I plan on having this module be intelligent enough to handle all coordinated motion without intervention from the main processor board. The Raspberry Pi should be able to send the board an XY position, say “Go here,” and have the motor board handle the rest. The Pi will also be able to query this submodule for status information such as the current position, the state of the move, etc.

Localization

Due to the relative complexity of the localization algorithm, as outlined in my previous post, the localization board will be the most complicated. It requires IR LEDs for transmitting the sync signal, IR receivers for receiving another robot’s sync signal, and ultrasonic transducers for sending and receiving localization pings.

This submodule won’t be as independent as the motor controller board due to the inter-robot communication requirements of the localization algorithm, but the interface should be simple. The board will only take one command, to send out a ping, so it knows to trigger the sync signal and localization signal on the IR LED and ultrasonic transmitter. Otherwise the board will constantly wait to receive an IR sync signal and will calculate the distance to the transmitting robot based on how long it takes for the localization signal to arrive. The main processor board will then be able to query the localization board for the distance to the transmitting robot.
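A minimal sketch of the receive-side timing is below, assuming the IR sync and ultrasonic ping leave the transmitter at the same instant; the two detect helpers are hypothetical placeholders for the real receiver circuits.

// Time-of-flight measurement on the receiving robot. The IR sync arrives
// essentially instantly, so the delay until the ultrasonic ping is the
// acoustic travel time. Both detect helpers are hypothetical placeholders.
bool irSyncDetected();
bool pingDetected();

float lastDistanceCm = 0;

void waitForPing() {
  while (!irSyncDetected()) {}              // block until a sync signal shows up
  unsigned long startTime = micros();
  while (!pingDetected()) {}                // then wait for the ultrasonic ping
  unsigned long flightTime = micros() - startTime;

  // Speed of sound is roughly 343 m/s, i.e. 0.0343 cm per microsecond.
  lastDistanceCm = flightTime * 0.0343;
}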

With the basics of the submodules laid out the next step is designing the circuits and PCBs!

PiGRRL Switch: Test Controller in Action!

2017-03-18 12.55.34

So I didn’t get a chance to take a video of the controller in action, but I do have a picture of me playing the original Zelda using my soldered-up controller prototype. The Raspberry Pi and LCD are hooked up to one battery bank and the controller is hooked up to a separate one. Going through this test I noticed a few things that can be improved.

For starters, I had a hellish time getting enough power to both the Raspberry Pi and LCD. For whatever reason the power supply didn’t seem to be able to give enough juice to keep the Raspberry Pi from rebooting. The display has two micro USB ports on it that are used to power the screen and provide touchscreen feedback to the Pi. However, since I currently don’t need the touchscreen, I just powered the screen directly from the battery bank. To circumvent these power issues in the future I’ll need to see if I can get the Pi to provide more current from its USB ports. If that doesn’t work, I can use one of the micro USB ports for touchscreen feedback, cut its red power wire so the screen can’t draw power from that port, and power the screen directly from the battery bank through the second USB port.

Another issue I noticed with my controller is the lack of menu buttons. The RetroPie interface requires Start and Select buttons for navigation, so I had to map those to an attached keyboard since I only have four action buttons and four directional buttons on my prototype. I’ll also want shoulder buttons, since the later consoles all have at least two and the PS1 has four.

The next step for the controller interface is designing the PCB!

PiGRRL Switch: Creating the Controllers

With the screen chosen and working, the next step for creating the PiGRRL Switch was prototyping the controllers.  Initially, I wanted something cheap and easy like a breakout board that acted as a Bluetooth HID joystick.  I was immediately drawn to Adafruit’s EZ-Key which acts as a Bluetooth keyboard.  At the time I’m writing this, however, it’s out of stock and seems to have been for a while.  Additionally, because it acts as a Bluetooth keyboard and not a joystick, it rules out any possibility of adding analog controls in the future.

Another alternative to a Bluetooth HID breakout would be taking apart a cheap Bluetooth joystick and putting it in a 3D printed casing.  However, I decided this would greatly reduce the design flexibility of the controllers and might make it difficult to reconfigure the controllers on the fly (i.e. using two JoyCons as one controller vs. using them as two separate controllers).

So with those two options off the table I decided instead to use a Bluetooth serial bridge.  The HM-10 BLE and HC-05 Bluetooth 2.0 modules are both cheap and plentiful and provide a good solution at the cost of some extra work.  These modules can be hooked up to the UART of an Arduino and paired over Bluetooth.  Once connected, the module acts as a virtual serial port in Linux, allowing the serial data to be read just as if the Arduino were connected via USB or FTDI.  The only exception is that firmware can’t be loaded wirelessly over this link.

2017-03-13 22.18.32

The next step was setting up the initial design on a breadboard.  Above is an Arduino Pro Mini, four pushbuttons wired to the digital pins, and the HM-10 BLE module.  I decided to use the HM-10 because of its lower power requirements (BLE being an initialism for Bluetooth Low Energy).  The code for the Arduino reads the values from the digital pins and prints out eight characters to signify which buttons are pressed (‘1’ for unpressed and ‘0’ for pressed).  Right now I’m using a byte for each button, which is wasteful, so I’ll go back at some point in the future and make the code more efficient so each button is represented by a bit.
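As a rough idea of what that future optimization might look like (the pin numbers here are placeholders), all eight button states can be packed into a single byte before being sent over the UART:

// Pack eight button states into one byte instead of printing eight characters.
// Pin numbers are placeholder assumptions; buttons are wired active-low.
const int buttonPins[8] = {2, 3, 4, 5, 6, 7, 8, 9};

void setup() {
  Serial.begin(9600);                        // UART feeds the Bluetooth module
  for (int i = 0; i < 8; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  byte state = 0;
  for (int i = 0; i < 8; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // pressed buttons pull the pin low
      state |= (1 << i);                     // set this button's bit
    }
  }
  Serial.write(state);                       // one byte for all eight buttons
  delay(10);
}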

2017-03-14 22.50.16

Once everything was wired up and running I had a lot of trouble finding an app that could connect to the HM-10 as a serial terminal.  Apparently the BLE standard has a lot of bells and whistles that make configuration a bit more difficult.  After trying several different apps I eventually found Serial Bluetooth Terminal, which can connect to both BLE and regular Bluetooth devices as a serial terminal.  Above is a screenshot of my phone connected to the controller with the button status being transmitted.

2017-03-14 20.31.24

2017-03-14 20.31.37

With my proof of concept working, I soldered everything onto a proto-board, this time with eight buttons: four to serve as a D-pad and four action buttons.

With that complete the next step was connecting to the Raspberry Pi over a serial terminal.  Unfortunately, this was much more difficult than I expected.  I could pair and connect to the HM-10, but couldn’t find a way to mount it as a serial terminal.

2017-03-15 21.01.17

Rather than continue further down the rabbit hole, I decided to drop the BLE module for now and switch to the HC-05 modules I bought as a backup.  Those have been around for years and have been used extensively with Arduino and Raspberry Pi.  Once that module was paired and connected, mounting it as a serial terminal was as simple as using the following commands to connect and then print out the values read from the module:

sudo rfcomm bind /dev/rfcomm0 <MAC Address>
sudo cat /dev/rfcomm0

2017-03-17 19.20.25

2017-03-17 22.01.22

Lastly, I connected the controller, screen, and Raspberry Pi to battery packs and verified everything still worked as expected.  Success!  The next step is writing a program for Linux that reads the button data coming off the serial port and uses it to emulate a controller for the console.

HAL 9000: Wiring

BOM

Wiring


With the Raspberry Pi as the centerpiece, I went about connecting everything together.  The first step was wiring up the sound.  I took a stereo audio cable plugged into the Raspberry Pi’s audio port and wired each of the left and right channels into its own amplifier.  The power and ground of both amplifiers were sourced from the Raspberry Pi’s 5V and GND pins on the GPIO header.  I then wired the outputs of one of the amplifiers to the speaker.  The outputs of the other amplifier were wired to the LED in the button.  By doing this, the light inside of HAL’s eye would flash in sync with the audio being played.  Aside from that, all that was left to do was plug in the USB microphone and I was done.  One optional addition I might make in the future is wiring up the inputs of the button.  This would provide the possibility of activating Alexa via means other than the wake word.

HAL 9000: Alexa Install

I originally thought this blog post was going to be a lengthy explanation of how to install the Alexa software (found here) on the Raspberry Pi 3 with all of the caveats, tweaking, and reconfiguration necessary to get the software to install.  Any Linux user who frequently installs software from source knows that the time it takes to get some software to compile and install is exponentially proportional to the complexity of the code and the compile time.  This is not the case here.

It did take roughly an hour to run the automated install script provided in the Alexa repository, but once that had completed everything ran almost perfectly right out of the box.  I’m utterly floored by this, and am incredibly impressed with the Alexa development team on the quality of their software project.  So really, if this is something you’re interested in doing, use their guide to set up everything.  All you really need is a Raspberry Pi 3, a microphone (I used this one), and a speaker (I used one from Adafruit which I’ll discuss in detail in my post on wiring).  The only thing I had to tweak was forcing the audio to be output on the 3.5mm jack using raspi-config and selecting the jack in Advanced Options->Audio.

And without further ado, my working example.

HAL 9000: Overview

A HAL 9000 replica has been on my “to make” list since Adafruit started stocking their massive, red arcade button.  They even created a tutorial for building a HAL replica!  When the Alexa developers added support for a wake word last month, I knew I had to build it.  Rather than simply playing sound effects with the Pi, I wanted to include Amazon’s new Alexa sample, which allows the Amazon Echo software to run on the Raspberry Pi 3.  Always a fan of tempting fate, I thought the HAL replica would be the perfect container for a voice assistant that has access to all of my smart home appliances.  What could go wrong?

OpenADR: Data Visualization

As the navigational algorithms used by the robot have gotten more complex, I’ve noticed several occasions where I don’t understand the logical paths the robot is taking.  Since the design is only going to get more complex from here on out, I decided to implement real-time data visualization to let me see what the robot “sees” and view what actions it’s taking in response.  With the Raspberry Pi now on board, it’s incredibly simple to implement a web page that hosts this information in an easy-to-use format.  The code required to enable data visualization is broken up into four parts: the firmware that sends the data, the serial server that reads the data from the Arduino, the web server that hosts the data visualization web page, and the actual web page that displays the data.

Firmware


// Prints "velocity=-255,255"
Serial.print("velocity=");
Serial.print(-FULL_SPEED);
Serial.print(",");
Serial.println(FULL_SPEED);
// Prints "point=90,25" which means 25cm at the angle of 90 degrees
Serial.print("point=");
Serial.print(AngleMap[i]);
Serial.print(",");
Serial.println(Distances[i]);

DataPrint.ino

The firmware level was the simplest part.  I only added print statements that run when the robot changes speed or reads the ultrasonic sensors.  These print key-value pairs over the USB serial port to the Raspberry Pi.

Serial Server


from PIL import Image, ImageDraw
import math
import time
import random
import serial
import json

# The scaling of the image is 1cm:1px

# JSON file to output to
jsonFile = "data.json"
# The graphical center of the robot in the image
centerPoint = (415, 415)
# Width of the robot in cm/px
robotWidth = 30
# Radius of the wheels on the robot
wheelRadius = 12.8
# Minimum sensing distance
minSenseDistance = 2
# Maximum sensing distance
maxSenseDistance = 400
# The distance from the robot to display the turn vector at
turnVectorDistance = 5

# Initialize global data variables
points = {}
velocityVector = [0, 0]

# Serial port to use
serialPort = "/dev/ttyACM0"

robotData = {}

ser = serial.Serial(serialPort, 115200)

# Parses a serial line from the robot and extracts the
# relevant information
def parseLine(line):
    status = line.split('=')
    statusType = status[0]

    # Parse the obstacle location
    if statusType == "point":
        coordinates = status[1].split(',')
        points[int(coordinates[0])] = int(coordinates[1])
    # Parse the velocity of the robot (x, y)
    elif statusType == "velocity":
        velocities = status[1].split(',')
        velocityVector[0] = int(velocities[0])
        velocityVector[1] = int(velocities[1])

def main():
    # Three possible test print files to simulate the serial link
    # to the robot
    #testPrint = open("TEST_straight.txt")
    #testPrint = open("TEST_tightTurn.txt")
    #testPrint = open("TEST_looseTurn.txt")

    time.sleep(1)
    ser.write('1')

    #for line in testPrint:
    while True:
        # Flush the input so we get the latest data and don't fall behind
        #ser.flushInput()
        line = ser.readline()
        parseLine(line.strip())

        robotData["points"] = points
        robotData["velocities"] = velocityVector

        #print json.dumps(robotData)
        jsonFilePointer = open(jsonFile, 'w')
        jsonFilePointer.write(json.dumps(robotData))
        jsonFilePointer.close()

        #print points

if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        print "Quitting"
        ser.write('0')

drawPoints.py

The Raspberry Pi then runs a serial server which processes these key-value pairs and converts them into a dictionary representing the robot’s left and right motor speeds and the obstacles viewed by each ultrasonic sensor.  This dictionary is then converted into the JSON format and written out to a file.


{"velocities": [-255, -255], "points": {"0": 10, "90": 200, "180": 15, "45": 15, "135": 413}}

data.json

This is an example of what the resulting JSON data looks like.

Web Server


import SimpleHTTPServer
import SocketServer
PORT = 80
Handler = SimpleHTTPServer.SimpleHTTPRequestHandler
httpd = SocketServer.TCPServer(("", PORT), Handler)
print "serving at port", PORT
httpd.serve_forever()

webserver.py

Using a Python HTTP server example, the Raspberry Pi runs a web server which hosts the generated JSON file and the web page that’s used for viewing the data.

Web Page


<!doctype html>
<html>
<head>
<script type="text/javascript" src="http://code.jquery.com/jquery.min.js"></script>
<style>
body{ background-color: white; }
canvas{border:1px solid black;}
</style>
<script>
$(function()
{
var myTimer = setInterval(loop, 1000);
var xhr = new XMLHttpRequest();
var c = document.getElementById("displayCanvas");
var ctx = c.getContext("2d");
// The graphical center of the robot in the image
var centerPoint = [415, 415];
// Width of the robot in cm/px
var robotWidth = 30;
// Radius of the wheels on the robot
wheelRadius = 12.8;
// Minimum sensing distance
var minSenseDistance = 2;
// Maximum sensing distance
var maxSenseDistance = 400;
// The distance from the robot to display the turn vector at
var turnVectorDistance = 5
// Update the image
function loop()
{
ctx.clearRect(0, 0, c.width, c.height);
// Set global stroke properties
ctx.lineWidth = 1;
// Draw the robot
ctx.strokeStyle="#000000";
ctx.beginPath();
ctx.arc(centerPoint[0], centerPoint[1], (robotWidth / 2), 0, 2*Math.PI);
ctx.stroke();
// Draw the robot's minimum sensing distance as a circle
ctx.strokeStyle="#00FF00";
ctx.beginPath();
ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + minSenseDistance), 0, 2*Math.PI);
ctx.stroke();
// Draw the robot's maximum sensing distance as a circle
ctx.strokeStyle="#FFA500";
ctx.beginPath();
ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + maxSenseDistance), 0, 2*Math.PI);
ctx.stroke();
xhr.onreadystatechange = processJSON;
xhr.open("GET", "data.json?" + new Date().getTime(), true);
xhr.send();
//var robotData = JSON.parse(text);
}
function processJSON()
{
if (xhr.readyState == 4)
{
var robotData = JSON.parse(xhr.responseText);
drawVectors(robotData);
drawPoints(robotData);
}
}
// Calculate the turn radius of the robot from the two velocities
function calculateTurnRadius(leftVector, rightVector)
{
var slope = ((rightVector - leftVector) / 10.0) / (2.0 * wheelRadius);
var yOffset = ((Math.max(leftVector, rightVector) + Math.min(leftVector, rightVector)) / 10.0) / 2.0;
var xOffset = 0;
if (slope != 0)
{
xOffset = Math.round((yOffset) / slope);
}
return Math.abs(xOffset);
}
// Calculate the angle required to display a turn vector with
// a length that matched the magnitude of the motion
function calculateTurnLength(turnRadius, vectorMagnitude)
{
var circumference = 2.0 * Math.PI * turnRadius;
return Math.abs((vectorMagnitude / circumference) * (2 * Math.PI));
}
function drawVectors(robotData)
{
leftVector = robotData.velocities[0];
rightVector = robotData.velocities[1];
// Calculate the magnitude of the velocity vector in pixels
// The 2.5 dividend was determined arbitrarily to provide an
// easy to read line
var vectorMagnitude = (((leftVector + rightVector) / 2.0) / 2.5);
ctx.strokeStyle="#FF00FF";
ctx.beginPath();
if (leftVector == rightVector)
{
var vectorEndY = centerPoint[1] - vectorMagnitude;
ctx.moveTo(centerPoint[0], centerPoint[1]);
ctx.lineTo(centerPoint[0], vectorEndY);
}
else
{
var turnRadius = calculateTurnRadius(leftVector, rightVector);
if (turnRadius == 0)
{
var outsideRadius = turnVectorDistance + (robotWidth / 2.0);
var rotationMagnitude = (((Math.abs(leftVector) + Math.abs(rightVector)) / 2.0) / 2.5);
var turnAngle = calculateTurnLength(outsideRadius, rotationMagnitude);
if (leftVector < rightVector)
{
ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI) - turnAngle, (1.5 * Math.PI));
}
if (leftVector > rightVector)
{
ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI), (1.5 * Math.PI) + turnAngle);
}
}
else
{
var turnAngle = 0;
if (vectorMagnitude != 0)
{
turnAngle = calculateTurnLength(turnRadius, vectorMagnitude);
}
// Turning forward and left
if ((leftVector < rightVector) && (leftVector + rightVector > 0))
{
turnVectorCenterX = centerPoint[0] - turnRadius;
turnVectorCenterY = centerPoint[1];
ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, turnAngle, 0);
}
// Turning backwards and left
else if ((leftVector > rightVector) && (leftVector + rightVector < 0))
{
turnVectorCenterX = centerPoint[0] - turnRadius;
turnVectorCenterY = centerPoint[1];
ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, 0, turnAngle);
}
// Turning forwards and right
else if ((leftVector > rightVector) && (leftVector + rightVector > 0))
{
turnVectorCenterX = centerPoint[0] + turnRadius;
turnVectorCenterY = centerPoint[1];
ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI, Math.PI + turnAngle);
}
// Turning backwards and right
else if ((leftVector < rightVector) && (leftVector + rightVector < 0))
{
turnVectorCenterX = centerPoint[0] + turnRadius;
turnVectorCenterY = centerPoint[1];
ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI - turnAngle, Math.PI);
}
}
}
ctx.stroke();
}
function drawPoints(robotData)
{
for (var key in robotData.points)
{
var angle = Number(key);
var rDistance = robotData.points[angle];
var cosValue = Math.cos(angle * (Math.PI / 180.0));
var sinValue = Math.sin(angle * (Math.PI / 180.0));
var xOffset = ((robotWidth / 2) + rDistance) * cosValue;
var yOffset = ((robotWidth / 2) + rDistance) * sinValue;
var xDist = centerPoint[0] + Math.round(xOffset);
var yDist = centerPoint[1] - Math.round(yOffset);
ctx.fillStyle = "#FF0000";
ctx.fillRect((xDist - 1), (yDist - 1), 2, 2);
}
}
});
function changeNavigationType()
{
var navSelect = document.getElementById("Navigation Type");
console.log(navSelect.value);
var http = new XMLHttpRequest();
var url = "http://ee-kelliott:8000/&quot;;
var params = "navType=" + navSelect.value;
http.open("GET", url, true);
http.onreadystatechange = function()
{
//Call a function when the state changes.
if(http.readyState == 4 && http.status == 200)
{
alert(http.responseText);
}
}
http.send(params);
}
</script>
</head>
<body>
<div>
<canvas id="displayCanvas" width="830" height="450"></canvas>
</div>
<select id="Navigation Type" onchange="changeNavigationType()">
<option value="None">None</option>
<option value="Random">Random</option>
<option value="Grid">Grid</option>
<option value="Manual">Manual</option>
</select>
</body>
</html>

index.html

The visualization web page uses an HTML canvas and JavaScript to display the data presented in the JSON file.  The velocity and direction of the robot are shown by a curved vector, with the length representing the speed of the robot and the curvature representing the radius of the robot’s turn.  The obstacles seen by the five ultrasonic sensors are represented on the canvas by five points.

DataImage

The picture above is the resulting web page without any JSON data.

Here’s a video of the robot running alongside a video of the data that’s displayed.  As you can see, the data visualization isn’t exactly real time.  Due to the communication delays over the network some of the robot’s decisions and sensor readings get lost.

Further Steps

Having the main structure of the data visualization set up provides a powerful interface to the robot that can be used to easily add more functionality in the future.  I’d like to add more data items as I add more sensors to the robot such as the floor color sensor, an accelerometer, a compass, etc.  At some point I also plan on creating a way to pass data to the robot from the web page, providing an interface to control the robot remotely.

I’d also like to improve the existing data visualization interface to minimize the amount of data lost due to network and processing delays.  One way I could do this would be by getting rid of the JSON file and streaming the data directly to the web page, eliminating unnecessary file I/O delays.  The Arduino also prints out the ultrasonic sensor data one sensor at a time, requiring the serial server to read five lines from the serial port to get the readings from all the sensors.  Compressing all the sensor data into a single line would also help improve the speed.
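For example (the exact format is just a suggestion, and the array types are assumed), the firmware could print everything in one key-value pair using the same AngleMap and Distances arrays as the snippet above:

// Print all five sensor readings on one line, e.g. "points=0:10,45:15,90:200,135:413,180:15".
// Assumes the AngleMap and Distances arrays from the existing firmware are ints.
extern int AngleMap[5];
extern int Distances[5];

void printAllPoints() {
  Serial.print("points=");
  for (int i = 0; i < 5; i++) {
    Serial.print(AngleMap[i]);
    Serial.print(":");
    Serial.print(Distances[i]);
    if (i < 4) {
      Serial.print(",");   // separate readings with commas
    }
  }
  Serial.println();
}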

Another useful feature would be to record the sensor data rather than just having it be visible from the web page.  Storing all of the written JSON files and replaying the run from the robot’s perspective would make it possible to review the data multiple times.

OpenADR: Connecting to Raspberry Pi

2016-06-11 23.25.45

The most frustrating part of developing the wall-following algorithm from my last post was the constant moving back and forth between my office, to tweak and load firmware, and my test area (the kitchen hallway).  To solve this problem, and to streamline development overall, I decided to add a Raspberry Pi to the robot.  Specifically, I’m using a Raspberry Pi 2 that I had lying around, but I expect to switch to a Pi Zero once I get the code and design into a more final state.  By installing the Arduino IDE and enabling SSH, I’m now able to access and edit code wirelessly.

Having a full-blown Linux computer on the robot also adds plenty of opportunity for new features.  Currently I’m planning on adding camera support via the official camera module and a web server to serve a web page for manual control, settings configuration, and data visualization.

While I expected hooking up the Raspberry Pi to be as easy as connecting to the Arduino over USB, this turned out not to be the case.  Having a barebones SBC revealed a few problems with my wiring and code.

The first issue I noticed was the Arduino resetting when the motors were running, but this was easily attributable to the current limit on the Raspberry Pi USB ports.  A USB 2.0 port, like those on the Pi, can only supply up to 500mA of current.  Motors similar to the ones I’m using are specced at 250mA each, so having both motors accelerating suddenly to full speed caused a massive voltage drop which reset the Arduino.  This was easily fixed by connecting the motor supply of the motor controller to the 5V output on the Raspberry Pi GPIO header.  Additionally, setting the max_usb_current flag in /boot/config.txt allows up to 2A on the 5V line, minus what the Pi uses.  2A should be more than sufficient once the motors, Arduino, and other sensors are hooked up.
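For reference, on this generation of Pi that change is a single line in the boot configuration:

# /boot/config.txt
max_usb_current=1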

The next issue I encountered was much more nefarious.  With the motors hooked up directly to the 5V on the Pi, changing from full-speed forward to full-speed backward caused everything to reset!  I don’t have an oscilloscope to confirm this, but my suspicion was that there was so much noise placed on the power supply by the motors that both boards were resetting.  This is where one of the differences between the Raspberry Pi and a full computer is most evident.  On a regular PC there’s plenty of space to add robust power circuitry and filters to prevent noise on the 5V USB lines, but because space on the Pi is at a premium the minimal filtering on the 5V bus wasn’t sufficient to remove the noise caused by the motors.  When I originally wrote the motor controller library I didn’t have motor noise in mind and instead just set the speed of the motor instantly.  In the case of both motors switching from full-speed forward to full-speed backward, the sudden reversal causes a huge spike in the power supply, as explained in this app note.  I was able to eventually fix this by rewriting the library to include some acceleration and deceleration.  By limiting the acceleration on the motors, the noise was reduced enough that the boards no longer reset.
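The change to the library boils down to stepping toward the requested speed instead of jumping to it. Here’s a rough sketch of the idea; the step size and delay are arbitrary, and setMotor() stands in for the library’s low-level drive call rather than being its real API.

// Limit acceleration by stepping toward the target speed instead of
// jumping to it. Step size and timing are arbitrary placeholders.
void setMotor(int pwm);   // hypothetical low-level drive call

int currentSpeed = 0;

void rampToSpeed(int targetSpeed) {
  while (currentSpeed != targetSpeed) {
    if (currentSpeed < targetSpeed) {
      currentSpeed += 5;                           // accelerate gradually
      if (currentSpeed > targetSpeed) currentSpeed = targetSpeed;
    } else {
      currentSpeed -= 5;                           // decelerate gradually
      if (currentSpeed < targetSpeed) currentSpeed = targetSpeed;
    }
    setMotor(currentSpeed);
    delay(5);                                      // spread the change out over time
  }
}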

While setting up the Raspberry Pi took longer than I’d hoped due to power supply problems, I’m glad I got a chance to learn the limits on my design and will keep bypass capacitors in mind when I design a permanent board.  I’m also excited for the possibilities having a Raspberry Pi on board provides and look forward to adding more advanced features to OpenADR!