OpenADR: Mop Design Decisions

In my last post, I described the beginnings of the first module for OpenADR, the vacuum.  With the Automation round of the Hackaday Prize contest ending this weekend, though, I decided to start working on a second module, a mop, before perfecting the vacuum module.  The market for robotic vacuum cleaners is looking pretty crowded these days, and most of the design kinks have been worked out by the major manufacturers.  Robotic mops, on the other hand, are far less common, with the only major ones being the Scooba and Braava series by iRobot.  Both of these robots seem to have little market penetration at this point, so the jury’s still out on what consumers want in a robotic mop.

I’ve been thinking through the design of this module for a while now.  The design for the vacuum module was simple enough; all it required was a roller to disturb dirt and a fan to suck it in.  Comparatively, the mop module will be much more complex.  Given that the market is still so new, I don’t plan on setting strict design goals for the mop like I did with the vacuum.  Instead, I’ll be laying out some basic design ideas for my first implementation.

The basic design I envision is as follows: water/cleaning solution gets pumped from a tank onto the floor, where it mixes with dirt and grime.  This dirty liquid is then scrubbed and mopped up with an absorbent cloth.  I know that probably sounds fairly cryptic now, but I’ll describe my plans for each stage of this process below.

Water Reservoir

Both the Scooba 450 and Braava Jet have tanks (750mL and 150mL, respectively) that they use to store cleaning solution or water for wetting the floor.  The simplest way to add a tank to the mop module would be to integrate one into the module’s 3D printed design that I described in an earlier post.  This is a little risky, however, as 3D printed parts can be difficult to make watertight (as evidenced by my struggles with sustainable sculptures).  Placing the robot’s electronics and batteries near a reservoir of water has the potential to be disastrous.  A much safer bet would be to use a pre-made container or even a cut plastic bottle.

Being an optimist, however, I’d rather take the risk on the 3D printed tank to take advantage of the customizability and integration that it would provide.  In the case of the sculptures, I wanted to keep the walls thin and transparent.  I won’t have such strict constraints in this case and can use a much more effective sealant to waterproof the tank.  And just to be on the safe side, I can include small holes in the bottom of the chassis (i.e., around the tank) near any possible leaks so the water drips out of the robot before it can reach any of the electronics.

Dispensing of Water

The next design decision is determining how to actually get the water from the tank to the floor.  While I looked for an easily sourceable water pump, I couldn’t find a cheap one that was small enough to fit well in the chassis.  Luckily there are some absolutely amazing, customizable, 3D printed pumps on Thingiverse that I can use instead!

Disturbing Dirt

The biggest complaint about robot mops seems to be a lack of scrubbing effectiveness, especially on dirt trapped in the grout between tiles.  The Braava uses a vibrating cloth pad to perform its scrubbing while the Scooba seems to use one of the brushed rollers from a Roomba.  Both of these options seem to be lacking based on users’ reviews; the best option would be scrubbing brushes designed especially for use with water (rather than the Roomba’s, which are designed to disturb carpet fibers during vacuuming).  As with the vacuum module, however, I had a hard time finding bristles or brushes to integrate into my design.  Unfortunately, using a roller made of flexible filament (i.e., my solution for the vacuum module) isn’t an option in this case, since it isn’t capable of the same scrubbing efficacy as a regular mop.

For my first version, I’m just going to use a microfiber cleaning cloth.  This has the benefit of being washable and reusable, unlike the cleaning pads on the Braava, and yet I can still achieve some scrubbing functionality by mounting the cleaning cloth to a rotary motor.

Water Recovery

A mop that leaves dirty water on the floor isn’t a very effective mop, so some sort of water and dirt recovery is required.  The Scooba uses a vacuum and squeegee to suck the water off of the floor back into a wastewater tank.  The Braava’s cleaning pad, on the other hand, serves double duty and acts as both a scrubber and sponge to soak up the dirty water.  Both of these options seem perfectly valid, but the Braava’s method seems like an easier implementation for a first revision.  It’s also the method that conventional mops use.  The microfiber cloth I decided to use for scrubbing can also serve to absorb the water and dirt from the floor.

It’s important to note, however, that using the absorption method for water recovery limits the robot’s water capacity and the amount of floor it can clean.  The mop could have a 10L water reservoir, but if the cloth can only absorb 100mL of it, there will still be 9.9L of water left on the floor.  The Braava only has a 150mL tank and 150 sq. ft. of range because its cleaning pad can only hold 150mL of water.  I’ll have to do some testing on the microfiber cloths I use to determine the maximum capacity of the mop module.
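
To make that trade-off concrete, here’s the back-of-the-envelope math written out as a quick sketch.  The numbers are hypothetical placeholders (I haven’t measured a real cloth yet), and the coverage ratio is just extrapolated from the Braava’s 150mL/150 sq. ft. spec:

#include <algorithm>
#include <iostream>

int main()
{
    // Hypothetical values -- replace with measured numbers.
    const float tankCapacity_mL  = 750.0f;  // reservoir size, Scooba-sized for illustration
    const float clothCapacity_mL = 100.0f;  // how much the microfiber cloth can absorb

    // The Braava covers ~150 sq. ft. on a 150mL pad, i.e. roughly 1 sq. ft. per mL
    const float sqFtPerML = 150.0f / 150.0f;

    // The cloth, not the tank, is the limiting factor: any water dispensed
    // beyond what the cloth can soak back up just stays on the floor.
    const float usable_mL = std::min(tankCapacity_mL, clothCapacity_mL);

    std::cout << "Estimated coverage: " << usable_mL * sqFtPerML << " sq. ft." << std::endl;
    return 0;
}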

Next Steps

Designing and printing out the mop module!

 

OpenADR: Vacuum Test #1

Now that the vacuum module is mostly assembled, it’s time for the first test! I basically just wanted to make sure that there was enough suction power generated by the fan to pull in dirt and debris. Here’s how the module looks so far:

IMG_0620.JPG

As I mentioned in my previous post, I didn’t design a lid for the vacuum module yet because I wanted to use a clear covering on top for now.  Having the interior of the dust bin visible will make it easier to test and view what is going on inside.  For now, I’ve sealed the top of the dust bin by taping on a cut up Ziploc bag.

The blower fan is rated for 12V, so I have it wired directly to my 12V bench supply using alligator clips.

IMG_0623
Standard dog hair
IMG_0401
Standard dog
The test itself was performed on standard dog hair (since I have so much of it lying around). I had to feed the hair directly into the dust bin input because the vacuum module isn’t yet attached to the main robot chassis and so there’s no direct airflow channel that passes through the roller assembly and into the dust bin.  I’m considering integrating the roller assembly directly into the vacuum’s body so the whole module is self-contained and the complete path of dust through the vacuum can be tested without having to attach it to the main chassis.

So the first test proved moderately successful!  The hair did get slightly stuck, but that can mostly be attributed to the flexible Ziploc bag material being sucked downward, thereby decreasing the height of the opening where the hair entered the dust bin.  For the next revision I’m probably going to curve the input air channel so hair and dust aren’t making any 90° turns.  Next up, testing the whole thing as part of the main chassis!

OpenADR: Vacuum Module v0.1

Now that the navigation functionality of the main chassis is mostly up and running, I’ve transitioned to designing modules that will fit into the chassis and give OpenADR all the functions it needs (see my last post).  The first module I’ve designed and built is the vacuum, since it’s currently the most popular implementation of domestic robotics on the market.  Because this is my first iteration of the vacuum (and because my wife is getting annoyed at the amount of dust and dog hair I’ve left accumulating on the floor “for testing purposes”), I kept the design very simple: just the roller, the body (which doubles as the dust bin), and the fan.

Roller Assembly

IMG_0513.JPG

The brush assembly is the most complicated aspect of the vacuum.  Since I couldn’t find an easily sourceable roller on eBay, I opted to design the entire assembly from scratch.  I used the same type of plain yellow motors that power the wheels on the main chassis to drive the roller.

The rollers themselves consist of two parts, the brush and the center core.  The brush is a flexible sleeve, printed with the same TPU filament used for the navigation chassis’s tires, that has spiraling ridges on the outside to disturb the carpet and knock dust and dirt particles loose.  The center core is a solid cylinder with a hole on one end for the motor shaft and a protruding smaller cylinder on the other that is used as an axle.  A roller is mounted on either side of the module, and both are driven by the motor in the center.

IMG_0617.JPG

To print the vacuum module, I had to modify the module base design that I described in my last post. I shortened the front, where the brush assembly will go, so that the dust will be sucked up between the back wall of the main chassis and the front of the vacuum module’s dust bin and be deposited in the dust bin.

Fan Mounting

IMG_0519

For the fan, I’ll be using Sparkfun’s squirrel blower.  I plan to eventually build a 3D model of the fan so that it fits snugly in the module, but in the meantime, the blower mount is just a hole in the back of the module where the blower outlet will be inserted and hot-glued into place.  The final version will include a slot for a carbon filter in this mount, but since I’m just working with a hole for the blower outlet in this first version, I cut up an extra carbon filter from my Desk Fume Extractor and taped it over the blower inlet to make sure dust doesn’t get inside the fan.

IMG_0525

The blower itself is positioned at the top of the dust bin with the inlet (where the air flows in) pointed downwards.  Once the blower gets clogged, the vacuum will no longer suck (or will it now suck?), so I positioned the inlet as high as possible on the module to maximize the space for debris in the dust bin before it gets clogged.

Dust Bin

The rest of the module is just empty space that serves as the vacuum’s dust bin.  I minimized the number of components inside this dust bin area to reduce the risk of dust and debris causing problems.  With the roller assembly placed outside the bin on the front of the module, the only component that will be inside of the dust bin is the blower.

With a rough estimate of the dimensions of the dust bin, the vacuum module has the potential to hold up to 1.7L!  This assumes that the entire dust bin is full, which might not be possible, but it’s still substantially more than the 0.6L of the Roomba 980 and 0.7L of the Neato Botvac.
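
As a sanity check on that number, here’s the rough volume math.  The interior dimensions below are hypothetical stand-ins (the real bin is an irregular shape wrapped around the blower), chosen only to show the calculation:

#include <iostream>

int main()
{
    // Hypothetical interior dimensions in mm -- the real dust bin is an
    // irregular shape, so treat this as an upper-bound estimate.
    const float width_mm  = 140.0f;
    const float depth_mm  = 135.0f;
    const float height_mm = 90.0f;

    // 1L = 1,000,000 cubic mm
    const float volume_L = (width_mm * depth_mm * height_mm) / 1000000.0f;
    std::cout << "Dust bin capacity: ~" << volume_L << " L" << std::endl;  // ~1.7L
    return 0;
}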

Future Improvements

There are a few things I’d like to improve in the next version of the vacuum module since this is really just alpha testing still. The first priority is designing a fan mount that fits the blower and provides the proper support.  Going hand in hand with this, the filter needs an easily accessible slot to slide in before the fan input (as opposed to the duct tape I am using now).

I also want to design and test several different types of rollers in order to compare efficiency.  The rollers I’m using now turned out much stiffer than I’d like, so, at the very least, I need to redesign them to be more flexible.  Alternatively, I could go with something more like the Roomba’s AeroForce rollers, which decrease the cross-sectional area of the air passage and thereby increase the air velocity.  These rollers offer better suction and less opportunity for hair to get wrapped around them, but are a little less effective on thicker carpets.

Further, I need to make sure that the dust bin is in fact air-tight so that dust isn’t getting into the main chassis or back onto the floor.  I included bolt mounts on the floor of the dust bin to connect the separate pieces together, but I don’t have mounts on the walls of the dust bin, and so I am using tape around the top of the bin to hold the pieces together for now.  Since any holes in the dust bin provide opportunity for its contents to leak onto the floor, making sure I have a good seal here is critical.  In the future I’d like to redesign these seams so that they are sealed more securely, possibly by using overlapping side walls.

Lastly, the vacuum module needs a lid.  For the current version I intentionally left out the lid so that I can see everything while I’m testing.  I plan to add a transparent covering to this version for that purpose (and so dust doesn’t go flying everywhere!).  In the final version, the lid will need to provide a good seal and be easily removable so that the dust bin can be emptied.

But before we do all that, let’s test this vacuum!

OpenADR: On Modularity, Part 2

While I’ve been working primarily on the vacuum component of OpenADR, my eventual goal is for this to be just one of several, interchangeable modules that the robot can operate with.  By making the whole thing modular, I can experiment with a range of functions without having to recreate the base hardware that handles movement and navigation (i.e., the hard stuff!).  Today I wanted to share a bit more about how I’m building in this functionality, even though I’m only working on one module for now.

IMG_0482

The OpenADR modules will plug into the opening that I have left in the main chassis.  The modules will slide into the missing part of the chassis (shown in the picture above) to make the robot a circle when fully assembled.  The slot where the module will be inserted is a 150 x 150 mm square in the center plus a quarter of the robot’s 300 mm diameter circle.  The picture below might give you a better sense of what I’m describing.

ModuleBase.png

While each of the modules will be different, the underlying design will be the same.  This way, regardless of which module you need to use (e.g., vacuuming, mopping, dusting), everything should fit nicely in the same main chassis with minimal modifications needed.

To aid in the design of the separate modules, I’ve created a baseline OpenSCAD model that fits into the main chassis.  The model is broken up into four pieces in order to make printing the parts easier, and I’ve included bolt mounts to attach them together.  The model also includes tracks that allow the module to slide into place against the ridges that I have added to the adjacent walls of the main chassis.  I’ll build off of this model to create each module to be sure that everything is easily interchangeable and fits smoothly (especially with my new filament!).

The great thing about OpenADR being modular is that I can always add new modules based on what would be useful to those using it.  So this is where I need your help.  What functionality would you like to see?  Are there cleaning supplies or techniques you use regularly on your floors that could be automated?

OpenADR: Long Term Plans

ProductHierarchy

With the Automation round beginning today, I decided to sketch out some of the long term plans I have for OpenADR.  All my updates so far have referenced it as a robot vacuum, with a navigation module and vacuum module that have to be connected together.

The way I see it, though, the navigation module will be the core focus of the platform with the modules being relatively dumb plug-ins that conform to a standard interface.  This makes it easy for anyone to design a simple module.  It’s also better from a cost perspective, as most of the cost will go towards the complex navigation module and the simple plug-ins can be cheap.  The navigation module will also do all of the power conversion and will supply several power rails to be used by the connected modules.

The modules that I’d like to design for the Hackaday Prize, if I have time, are the vacuum, mop, and wipe.  The vacuum module would provide the same functionality as a Roomba or Neato, the mop would be somewhere between a Scooba and Braava Jet, and the wipe would just be a reusable microfiber pad that would pick up dust and spills.

At some point I’d also like to expand OpenADR to have outdoor, domestic robots as well.  It would involve designing a new, bigger, more robust, and higher power navigation unit to handle the tougher requirements of yard work.  From what I can tell the current robotic mowers are sorely lacking, so that would be the primary focus, but I’d eventually like to expand to leaf collection and snow blowing/shoveling modules due to the lack of current offerings in both of those spaces.

Due to limited time and resources the indoor robotics for OpenADR will be my focus for the foreseeable future, but I’m thinking ahead and have a lot of plans in mind!

OpenADR: Data Visualization

As the navigational algorithms used by the robot have gotten more complex, I’ve noticed several occasions where I don’t understand the logical paths the robot is taking.  Since the design is only going to get more complex from here on out, I decided to implement real time data visualization to let me see what the robot “sees” and view what actions it’s taking in response.  With the Raspberry Pi now on board, it’s incredibly simple to implement a web page that hosts this information in an easy to use format.  The code required to enable data visualization is broken up into four parts: the firmware to send the data, the serial server to read the data from the Arduino, the web server to host the data visualization web page, and the actual web page that displays the data.

Firmware


// Prints "velocity=-255,255"
Serial.print("velocity=");
Serial.print(-FULL_SPEED);
Serial.print(",");
Serial.println(FULL_SPEED);
// Prints "point=90,25" which means 25cm at the angle of 90 degrees
Serial.print("point=");
Serial.print(AngleMap[i]);
Serial.print(",");
Serial.println(Distances[i]);

DataPrint.ino

The firmware level was the simplest part.  I only added print statements to the code when the robot changes speed or reads the ultrasonic sensors.  It prints out key value pairs over the USB serial port to the Raspberry Pi.

Serial Server


from PIL import Image, ImageDraw
import math
import time
import random
import serial
import json

# The scaling of the image is 1cm:1px

# JSON file to output to
jsonFile = "data.json"

# The graphical center of the robot in the image
centerPoint = (415, 415)
# Width of the robot in cm/px
robotWidth = 30
# Radius of the wheels on the robot
wheelRadius = 12.8
# Minimum sensing distance
minSenseDistance = 2
# Maximum sensing distance
maxSenseDistance = 400
# The distance from the robot to display the turn vector at
turnVectorDistance = 5

# Initialize global data variables
points = {}
velocityVector = [0, 0]

# Serial port to use
serialPort = "/dev/ttyACM0"

robotData = {}

ser = serial.Serial(serialPort, 115200)

# Parses a serial line from the robot and extracts the
# relevant information
def parseLine(line):
    status = line.split('=')
    statusType = status[0]

    # Parse the obstacle location
    if statusType == "point":
        coordinates = status[1].split(',')
        points[int(coordinates[0])] = int(coordinates[1])
    # Parse the velocity of the robot (x, y)
    elif statusType == "velocity":
        velocities = status[1].split(',')
        velocityVector[0] = int(velocities[0])
        velocityVector[1] = int(velocities[1])

def main():
    # Three possible test print files to simulate the serial link
    # to the robot
    #testPrint = open("TEST_straight.txt")
    #testPrint = open("TEST_tightTurn.txt")
    #testPrint = open("TEST_looseTurn.txt")

    time.sleep(1)
    ser.write('1')

    #for line in testPrint:
    while True:
        # Flush the input so we get the latest data and don't fall behind
        #ser.flushInput()
        line = ser.readline()
        parseLine(line.strip())

        robotData["points"] = points
        robotData["velocities"] = velocityVector
        #print json.dumps(robotData)

        jsonFilePointer = open(jsonFile, 'w')
        jsonFilePointer.write(json.dumps(robotData))
        jsonFilePointer.close()
        #print points

if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        print "Quitting"
        ser.write('0')

drawPoints.py

The Raspberry Pi then runs a serial server which processes these key value pairs and converts them into a dictionary representing the robot’s left motor and right motor speeds and the obstacles viewed by each ultrasonic sensor.  This dictionary is then converted into the JSON format and written out to a file.


{"velocities": [-255, -255], "points": {"0": 10, "90": 200, "180": 15, "45": 15, "135": 413}}

data.json

This is an example of what the resulting JSON data looks like.

Web Server


import SimpleHTTPServer
import SocketServer
PORT = 80
Handler = SimpleHTTPServer.SimpleHTTPRequestHandler
httpd = SocketServer.TCPServer(("", PORT), Handler)
print "serving at port", PORT
httpd.serve_forever()

webserver.py

Using a Python HTTP server example, the Raspberry Pi runs a web server which hosts the generated JSON file and the web page that’s used for viewing the data.

Web Page


<!doctype html>
<html>
<head>
<script type="text/javascript" src="http://code.jquery.com/jquery.min.js"></script>
<style>
body { background-color: white; }
canvas { border: 1px solid black; }
</style>
<script>
$(function()
{
    var myTimer = setInterval(loop, 1000);
    var xhr = new XMLHttpRequest();
    var c = document.getElementById("displayCanvas");
    var ctx = c.getContext("2d");

    // The graphical center of the robot in the image
    var centerPoint = [415, 415];
    // Width of the robot in cm/px
    var robotWidth = 30;
    // Radius of the wheels on the robot
    var wheelRadius = 12.8;
    // Minimum sensing distance
    var minSenseDistance = 2;
    // Maximum sensing distance
    var maxSenseDistance = 400;
    // The distance from the robot to display the turn vector at
    var turnVectorDistance = 5;

    // Update the image
    function loop()
    {
        ctx.clearRect(0, 0, c.width, c.height);

        // Set global stroke properties
        ctx.lineWidth = 1;

        // Draw the robot
        ctx.strokeStyle = "#000000";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], (robotWidth / 2), 0, 2 * Math.PI);
        ctx.stroke();

        // Draw the robot's minimum sensing distance as a circle
        ctx.strokeStyle = "#00FF00";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + minSenseDistance), 0, 2 * Math.PI);
        ctx.stroke();

        // Draw the robot's maximum sensing distance as a circle
        ctx.strokeStyle = "#FFA500";
        ctx.beginPath();
        ctx.arc(centerPoint[0], centerPoint[1], ((robotWidth / 2) + maxSenseDistance), 0, 2 * Math.PI);
        ctx.stroke();

        xhr.onreadystatechange = processJSON;
        xhr.open("GET", "data.json?" + new Date().getTime(), true);
        xhr.send();
    }

    function processJSON()
    {
        if (xhr.readyState == 4)
        {
            var robotData = JSON.parse(xhr.responseText);
            drawVectors(robotData);
            drawPoints(robotData);
        }
    }

    // Calculate the turn radius of the robot from the two velocities
    function calculateTurnRadius(leftVector, rightVector)
    {
        var slope = ((rightVector - leftVector) / 10.0) / (2.0 * wheelRadius);
        var yOffset = ((Math.max(leftVector, rightVector) + Math.min(leftVector, rightVector)) / 10.0) / 2.0;
        var xOffset = 0;

        if (slope != 0)
        {
            xOffset = Math.round(yOffset / slope);
        }

        return Math.abs(xOffset);
    }

    // Calculate the angle required to display a turn vector with
    // a length that matches the magnitude of the motion
    function calculateTurnLength(turnRadius, vectorMagnitude)
    {
        var circumference = 2.0 * Math.PI * turnRadius;
        return Math.abs((vectorMagnitude / circumference) * (2 * Math.PI));
    }

    function drawVectors(robotData)
    {
        var leftVector = robotData.velocities[0];
        var rightVector = robotData.velocities[1];

        // Calculate the magnitude of the velocity vector in pixels
        // The 2.5 divisor was determined arbitrarily to provide an
        // easy to read line
        var vectorMagnitude = (((leftVector + rightVector) / 2.0) / 2.5);

        ctx.strokeStyle = "#FF00FF";
        ctx.beginPath();

        if (leftVector == rightVector)
        {
            var vectorEndY = centerPoint[1] - vectorMagnitude;
            ctx.moveTo(centerPoint[0], centerPoint[1]);
            ctx.lineTo(centerPoint[0], vectorEndY);
        }
        else
        {
            var turnRadius = calculateTurnRadius(leftVector, rightVector);

            if (turnRadius == 0)
            {
                // Turning in place: draw the rotation as an arc around the robot
                var outsideRadius = turnVectorDistance + (robotWidth / 2.0);
                var rotationMagnitude = (((Math.abs(leftVector) + Math.abs(rightVector)) / 2.0) / 2.5);
                var turnAngle = calculateTurnLength(outsideRadius, rotationMagnitude);

                if (leftVector < rightVector)
                {
                    ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI) - turnAngle, (1.5 * Math.PI));
                }
                if (leftVector > rightVector)
                {
                    ctx.arc(centerPoint[0], centerPoint[1], outsideRadius, (1.5 * Math.PI), (1.5 * Math.PI) + turnAngle);
                }
            }
            else
            {
                var turnAngle = 0;
                if (vectorMagnitude != 0)
                {
                    turnAngle = calculateTurnLength(turnRadius, vectorMagnitude);
                }

                // Turning forward and left
                if ((leftVector < rightVector) && (leftVector + rightVector > 0))
                {
                    var turnVectorCenterX = centerPoint[0] - turnRadius;
                    var turnVectorCenterY = centerPoint[1];
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, -turnAngle, 0);
                }
                // Turning backwards and left
                else if ((leftVector > rightVector) && (leftVector + rightVector < 0))
                {
                    var turnVectorCenterX = centerPoint[0] - turnRadius;
                    var turnVectorCenterY = centerPoint[1];
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, 0, turnAngle);
                }
                // Turning forwards and right
                else if ((leftVector > rightVector) && (leftVector + rightVector > 0))
                {
                    var turnVectorCenterX = centerPoint[0] + turnRadius;
                    var turnVectorCenterY = centerPoint[1];
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI, Math.PI + turnAngle);
                }
                // Turning backwards and right
                else if ((leftVector < rightVector) && (leftVector + rightVector < 0))
                {
                    var turnVectorCenterX = centerPoint[0] + turnRadius;
                    var turnVectorCenterY = centerPoint[1];
                    ctx.arc(turnVectorCenterX, turnVectorCenterY, turnRadius, Math.PI - turnAngle, Math.PI);
                }
            }
        }

        ctx.stroke();
    }

    function drawPoints(robotData)
    {
        for (var key in robotData.points)
        {
            var angle = Number(key);
            var rDistance = robotData.points[angle];
            var cosValue = Math.cos(angle * (Math.PI / 180.0));
            var sinValue = Math.sin(angle * (Math.PI / 180.0));
            var xOffset = ((robotWidth / 2) + rDistance) * cosValue;
            var yOffset = ((robotWidth / 2) + rDistance) * sinValue;
            var xDist = centerPoint[0] + Math.round(xOffset);
            var yDist = centerPoint[1] - Math.round(yOffset);

            ctx.fillStyle = "#FF0000";
            ctx.fillRect((xDist - 1), (yDist - 1), 2, 2);
        }
    }
});

function changeNavigationType()
{
    var navSelect = document.getElementById("Navigation Type");
    console.log(navSelect.value);

    var http = new XMLHttpRequest();
    var url = "http://ee-kelliott:8000/";
    var params = "navType=" + navSelect.value;

    http.open("GET", url, true);
    http.onreadystatechange = function()
    {
        // Call a function when the state changes
        if (http.readyState == 4 && http.status == 200)
        {
            alert(http.responseText);
        }
    }
    http.send(params);
}
</script>
</head>
<body>
<div>
<canvas id="displayCanvas" width="830" height="450"></canvas>
</div>
<select id="Navigation Type" onchange="changeNavigationType()">
<option value="None">None</option>
<option value="Random">Random</option>
<option value="Grid">Grid</option>
<option value="Manual">Manual</option>
</select>
</body>
</html>

index.html

The visualization web page uses an HTML canvas and JavaScript to display the data presented in the JSON file.  The velocity and direction of the robot are shown by a curved vector, with the length representing the speed of the robot and the curvature representing the radius of the robot’s turn.  The obstacles seen by the five ultrasonic sensors are represented on the canvas by five points.

DataImage

The picture above is the resulting web page without any JSON data.

Here’s a video of the robot running alongside a video of the data that’s displayed.  As you can see, the data visualization isn’t exactly real time.  Due to the communication delays over the network, some of the robot’s decisions and sensor readings get lost.

Further Steps

Having the main structure of the data visualization set up provides a powerful interface to the robot that can be used to easily add more functionality in the future.  I’d like to add more data items as I add more sensors to the robot such as the floor color sensor, an accelerometer, a compass, etc.  At some point I also plan on creating a way to pass data to the robot from the web page, providing an interface to control the robot remotely.

I’d also like to improve the existing data visualization interface to minimize the amount of data lost to network and processing delays.  One way to do this would be to get rid of the JSON file and stream the data directly to the web page, eliminating unnecessary file I/O delays.  The Arduino also prints out the ultrasonic sensor data one sensor at a time, requiring the serial server to read five lines from the serial port to get a reading from all the sensors.  Compressing all sensor data into a single line would also help improve the speed, as sketched below.
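
As a sketch of that single-line idea, the firmware could batch a full sensor sweep into one packet.  This borrows the AngleMap and Distances arrays from the firmware above, but the packet format itself is just a guess at what I might settle on:

// Hypothetical packet format: "sensors=180:12,135:47,90:200,45:33,0:15"
// One println per sweep means the serial server only has to parse a
// single line to get readings from all five sensors.
void printSensorPacket()
{
    Serial.print("sensors=");
    for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
    {
        Serial.print(AngleMap[i]);
        Serial.print(":");
        Serial.print(Distances[i]);
        if (i < (NUM_DISTANCE_SENSORS - 1))
        {
            Serial.print(",");
        }
    }
    Serial.println();
}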

Another useful feature would be to record the sensor data rather than just having it be visible from the web page.  Storing all of the written JSON files and replaying the run from the robot’s perspective would make it possible to review the data multiple times.

OpenADR: Much Ado About Batteries

This post is going to be less of a project update, and more of a stream of consciousness monologue about one of the aspects of OpenADR that’s given me a lot of trouble.  Power electronics isn’t my strong suit and the power requirements for the robot are giving me pause.  Additionally, in my last post I wrote about some of the difficulties I encountered when trying to draw more than a few hundred milliamps from a USB battery pack.  Listed below are the ideal features for the Nav unit’s power supply and the reason behind each.  I will also be comparing separate options for batteries and power systems to see which system best fits the requirements.

Requirements

  • Rechargeable – This one should be obvious: all of the current robot vacuums on the market use rechargeable batteries.  It’d be too inconvenient and expensive for the user to constantly replace batteries.
  • In-circuit charging – This just means that the batteries can be charged inside of the robot without having to take it out and plug it in.  The big robot vacuum models like Neato and Roomba both automatically find their way back to a docking station and start the charging process themselves.  While auto-docking isn’t high on my list of priorities, I’d still like to make it possible to do in the future.  It would also be much more convenient for the user to not have to manually take the battery out of the robot to charge it.
  • Light weight – The lighter the robot is, the less power it needs to move.  Similarly, high energy density, or the amount of energy a battery stores for its weight, is also important.
  • Multiple voltage rails – As stated in my last post I’m powering both the robot motors and logic boards from the same 5V power source.  This is part of the reason I had so many issues with the motors causing the Raspberry Pi to reset.  More robust systems usually separate the motors and logic so that they use separate power supplies, thereby preventing electrical noise, caused by the motors, from affecting the digital logic.  Due to the high power electronics that will be necessary on OpenADR modules, like the squirrel cage fan I’ll be using for the vacuum, I’m also planning on having a 12V power supply that will be connected between the Nav unit and any modules.  Therefore my current design will require three power supplies in total; a 5V supply for the control boards and sensors, a 6V supply for the motors, and a 12V supply for the module interface.  Each of these different voltage rails can be generated from a single battery by using DC-DC converters, but each one will add complexity and cost.
  • High Power – With a guesstimated 1A at 5V for the logic and sensors, 0.5A at 6V for the drive motors, and 2A at 12V for the module power, the battery is going to need to be capable of supplying upwards of 30W to the robot (see the quick power budget sketch after this list).
  • Convenient – The battery used for the robot needs to be easy to use.  OpenADR should be the ultimate convenience system, taking away the need to perform tedious tasks.  This convenience should also apply to the battery.  Even if the battery isn’t capable of being charged in-circuit, it should at least be easily accessible and easy to charge.
  • Price – OpenADR is meant to be an affordable alternative to the expensive domestic robots on the market today.  As such, the battery and battery charging system should be affordable.
  • Availability – In a perfect world, every component of the robot would be easily available from eBay, Sparkfun, or Adafruit.  It would also be nice if the battery management and charging system used existing parts without needing to design a custom solution.
  • Safety – Most importantly, the robot needs to be safe!  With the hoverboard fiasco firmly in mind, the battery system needs to be safe with no chance of fires or other unsavory behavior.  This also relates to the availability requirement: an already available, tried and true system would be preferable to an untested, custom one.
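
To put a number on the High Power requirement above, here’s the quick power budget math behind the 30W figure, using the guesstimated loads from the list:

#include <iostream>

int main()
{
    // Guesstimated loads from the requirements list above
    const float logicPower  = 1.0f * 5.0f;   // 1A at 5V for control boards and sensors
    const float motorPower  = 0.5f * 6.0f;   // 0.5A at 6V for the drive motors
    const float modulePower = 2.0f * 12.0f;  // 2A at 12V for the module interface

    // Ignoring converter losses, the battery needs to source roughly:
    std::cout << "Total load: " << (logicPower + motorPower + modulePower) << " W" << std::endl;  // 32W
    return 0;
}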

Metrics

The rechargeability and safety requirements, as well as the need for three separate power supplies, are really non-negotiable factors that I can’t compromise on.  I also have no control over whether in-circuit charging is feasible, how convenient a system is, how heavy a system is, and the availability of parts, so while I’ll provide a brief discussion of each factor, they will not be the main factors I use for comparison.  I will instead focus on power, or rather the efficiency of the power delivery to the three power supplies, and price.

Battery Chemistry

There were three types of battery chemistry that I considered for my comparison.  The first is LiPo or Li-Ion batteries, which currently have the highest energy density on the battery market, making them a good candidate for a lightweight robot.  The latest Neato and Roomba robots also use Li-Ion batteries.  The big drawback for these is safety.  As demonstrated by the hoverboard fires, if they’re not properly handled or charged they can be explosive.  In my mind, this almost completely rules out a custom charging solution.  Luckily, there are plenty of options for single cell LiPo/Li-Ion chargers available from both SparkFun and Adafruit.

Second is LiFePO4 batteries.  While not as popular as LiPos and Li-Ions due to their lower energy density, they’re much safer; a battery can even be punctured without catching fire.  Other than that, they’re very similar to LiPo/Li-Ion batteries.

Last is NiMH batteries.  They were the industry standard for most robots for a while, and are used in all but the latest Roomba and Neato models.  They have recently fallen out of favor due to a lower energy density than both types of lithium batteries.  I haven’t included any NiMH systems in my comparisons because they don’t provide any significant advantages over the other two chemistries.

Systems

  1. 1S Li-Ion – A single cell Li-Ion would probably be the easiest option as far as the battery goes.  Single cell Li-Ion batteries with protection circuitry are used extensively in the hobby community.  Because of this they’re easy to obtain, with high capacity cells and simple charging electronics readily available.  This would make in-circuit charging possible.  The trade-off for simplicity of the battery is complexity in the DC-DC conversion circuits.  A single cell Li-Ion only has a cell voltage of 3.7V, making it necessary to convert the voltage for all three power supplies.  Because the lower voltage also means lower power batteries, several would need to be paralleled to achieve the same amount of power as the other battery configurations.  Simple boost converters could supply power to both the logic and motor power supplies.  The 12V rail would require several step-up converters to supply the requisite amount of current at the higher voltage.  Luckily these modules are cheap and easy to find on eBay.
  2. 3S LiPo – Another option would be using a 3 cell LiPo.  These batteries are widely used for quadcopters and other hobby RC vehicles, making them easy to find.  Also, because a three cell LiPo has a battery voltage of 11.1V, no voltage conversion would be necessary for the 12V power supply.  Only two step-down regulators would be needed, supplying power to the logic and motor power supplies.  These regulators are also widely available on eBay and are just as cheap as the step-up regulators.  The downside is that, as I’ve mentioned before, LiPos are inherently temperamental and can be dangerous.  I also had trouble finding high quality charging circuitry for multi-cell batteries that could be used to charge the battery in-circuit, meaning the user would have to remove the battery for charging.
  3. 4S LiFePO4 – Lastly is the four cell LiFePO4 battery system.  It has all the same advantages as the three cell LiPo configuration, but with the added safety of the LiFePO4 chemistry.  Also, because the four cells result in a battery voltage of 12.8V-13.2V, it would be possible to put a diode on the positive battery terminal, adding the ability to safely connect several batteries in parallel while still staying above the desired 12V module supply voltage.  LiFePO4 batteries are also easier to charge and don’t have the same exploding issues when charging as LiPos, so it would be possible to design custom charging circuitry to enable in-circuit charging for the robot.  The only downside, as far as LiFePO4 batteries go, is availability.  Because this chemistry isn’t as widely used as LiPo and Li-Ion, there are fewer options when it comes to finding batteries to use.

Power Efficiency Comparison

To further examine the three options I listed above, I compared the power delivery systems of each and roughly calculated their efficiency.  Below are the Google Sheets calculations.  I assumed the nominal voltage of the batteries in my calculations and that the total power capacity of each battery was the same.  From there I guesstimated the power required by the robot, the current draw on the batteries, and the power efficiency of the whole system.  I also used this power efficiency to calculate the runtime of the robot lost due to the inefficiency.  The efficiencies I used for the DC-DC converters were estimated using the data and voltage-efficiency curves listed in the datasheets.

Due to the high currents necessary for the single cell Li-Ion system, I assumed that there would be an always-on P-Channel MOSFET after each battery, effectively acting as a diode with a lower voltage drop and allowing multiple batteries to be added together in parallel.  I also assumed a diode would be placed after the 12V boost regulators, due to the fact that multiple would be needed to supply the desired 1.5A.

[Google Sheets: power efficiency calculations for the three battery systems]
Here I assumed that a diode would be connected between the battery and 12V bus power, allowing for parallelization of batteries and bringing the voltage closer to the desired 12V.
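
The spreadsheets themselves didn’t survive the trip to this page, so here’s a rough reconstruction of the comparison in code form.  The converter efficiencies are placeholder estimates in the spirit of the datasheet curves I used, not my exact spreadsheet numbers:

#include <cstdio>

struct PowerSystem
{
    const char* name;
    float eff5V;   // efficiency of the 5V logic supply conversion
    float eff6V;   // efficiency of the 6V motor supply conversion
    float eff12V;  // efficiency of the 12V module supply conversion
};

int main()
{
    // Guesstimated loads: 5W logic, 3W motors, 24W module interface
    const float load5V = 5.0f, load6V = 3.0f, load12V = 24.0f;

    // Placeholder efficiencies, not exact spreadsheet values
    const PowerSystem systems[] = {
        {"1S Li-Ion (3.7V, boost x3)",  0.92f, 0.92f, 0.85f},
        {"3S LiPo (11.1V, buck x2)",    0.90f, 0.90f, 1.00f},  // 12V rail fed directly from the battery
        {"4S LiFePO4 (12.8V, buck x2)", 0.90f, 0.90f, 0.97f},  // diode drop on the 12V rail
    };

    for (const PowerSystem& s : systems)
    {
        const float input = load5V / s.eff5V + load6V / s.eff6V + load12V / s.eff12V;
        const float efficiency = (load5V + load6V + load12V) / input;
        std::printf("%-30s overall efficiency: %.1f%%\n", s.name, efficiency * 100.0f);
    }
    return 0;
}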

Conclusion

Looking at the facts and power efficiencies I listed previously, the 4S LiFePO4 battery is looking like the most attractive option.  While its efficiency would be slightly lower than the 3S LiPo, I think the added safety and possibility for in-circuit charging makes it worth it.  While I’m not sure if OpenADR will ultimately end up using LiFePO4 batteries, that’s the path I’m going to explore for now.  Of course, power and batteries aren’t really in my wheelhouse so comments and suggestions are welcome.

OpenADR: Wall Following

After several different attempts at wall following algorithms and a lot of tweaking, I’ve finally settled on a basic algorithm that mostly works and allows the OpenADR navigation unit to roughly follow a wall.  The test run is below:

Test Run

Algorithm

I used a very basic subsumption architecture as the basis for my wall following algorithm.  This just means that the robot has a prioritized list of behaviors, each triggered by a condition.  Using my own algorithm as an example, the triggers and behaviors of the robot are listed below:

  • A wall closer than 10cm in front triggers the robot to turn left until the wall is on the right.
  • If the front-right sensor is closer to the wall than the right sensor, the robot is angled towards the wall and so turns left.
  • If the robot is closer than the desired distance from the wall, it turns left slightly to move further away.
  • If the robot is further than the desired distance from the wall, it turns right slightly to move closer.
  • Otherwise it just travels straight.

The robot goes through these conditions sequentially; if a condition is met, the robot performs the triggered behavior and skips the remaining conditions.  As displayed in the test run video, this method mostly works but certainly has room for improvement.

Source


#include <L9110.h>
#include <HCSR04.h>

#define HALF_SPEED 127
#define FULL_SPEED 255

#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4

#define TARGET_DISTANCE 6
#define TOLERANCE 2

// Mapping of all of the distance sensor angles
uint16_t AngleMap[] = {180, 135, 90, 45, 0};

// Array of distance sensor objects
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};

uint16_t Distances[NUM_DISTANCE_SENSORS];
uint16_t Distances_Previous[NUM_DISTANCE_SENSORS];

L9110 motors(9, 8, 3, 2);

void setup()
{
    Serial.begin(9600);

    // Initialize all distances to 0
    for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
    {
        Distances[i] = 0;
        Distances_Previous[i] = 0;
    }
}

void loop()
{
    updateSensors();

    // If there's a wall ahead
    if (Distances[DEG_90] < 10)
    {
        uint8_t minDir;

        Serial.println("Case 1");

        // Reverse slightly
        motors.backward(FULL_SPEED);
        delay(100);

        // Turn left until the wall is on the right
        do
        {
            updateSensors();
            minDir = getClosestWall();
            motors.turnLeft(FULL_SPEED);
            delay(100);
        }
        while ((Distances[DEG_90] < 10) && (minDir != DEG_0));
    }
    // If the front right sensor is closer to the wall than the right sensor, the robot is angled toward the wall
    else if ((Distances[DEG_45] <= Distances[DEG_0]) && (Distances[DEG_0] < (TARGET_DISTANCE + TOLERANCE)))
    {
        Serial.println("Case 2");

        // Turn left to straighten out
        motors.turnLeft(FULL_SPEED);
        delay(100);
    }
    // If the robot is too close to the wall and isn't getting farther
    else if ((checkWallTolerance(Distances[DEG_0]) == -1) && (Distances[DEG_0] <= Distances_Previous[DEG_0]))
    {
        Serial.println("Case 3");
        motors.turnLeft(FULL_SPEED);
        delay(100);
        motors.forward(FULL_SPEED);
        delay(100);
    }
    // If the robot is too far from the wall and isn't getting closer
    else if ((checkWallTolerance(Distances[DEG_0]) == 1) && (Distances[DEG_0] >= Distances_Previous[DEG_0]))
    {
        Serial.println("Case 4");
        motors.turnRight(FULL_SPEED);
        delay(100);
        motors.forward(FULL_SPEED);
        delay(100);
    }
    // Otherwise keep going straight
    else
    {
        motors.forward(FULL_SPEED);
        delay(100);
    }
}

// A function to retrieve the distance from all sensors
void updateSensors()
{
    for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
    {
        Distances_Previous[i] = Distances[i];
        Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
        Serial.print(AngleMap[i]);
        Serial.print(":");
        Serial.println(Distances[i]);
        delay(1);
    }
}

// Retrieve the angle of the closest wall
uint8_t getClosestWall()
{
    uint8_t tempMin = 255;
    uint16_t tempDist = 500;

    for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
    {
        if (min(tempDist, Distances[i]) == Distances[i])
        {
            tempDist = Distances[i];
            tempMin = i;
        }
    }

    return tempMin;
}

// Check if the robot is within the desired distance from the wall
int8_t checkWallTolerance(uint16_t measurement)
{
    if (measurement < (TARGET_DISTANCE - TOLERANCE))
    {
        return -1;
    }
    else if (measurement > (TARGET_DISTANCE + TOLERANCE))
    {
        return 1;
    }
    else
    {
        return 0;
    }
}

Conclusion

There are still plenty of problems that I need to tackle for the robot to be able to successfully navigate my apartment.  I tested it in a very controlled environment and the algorithm I’m currently using isn’t robust enough to handle oddly shaped rooms or obstacles.  It also still tends to get stuck butting against the wall and completely ignores exterior corners.

Some of the obstacle and navigation problems will hopefully be remedied by adding bump sensors to the robot.  The sonar coverage of the area around the robot is sparser than I originally thought, and the 2cm blind spot around the robot is causing some problems.  More advanced navigation will also be helped by improving my algorithm to build a map of the room in the robot’s memory and by adding encoders for more precise position tracking.

OpenADR: Test Platform

The second round of the Hackaday Prize ends tomorrow, so I’ve spent the weekend hard at work on the OpenADR test platform.  I’ve been doing quite a bit of soldering and software testing to streamline development and determine my next steps.  This blog post will outline the hardware changes and software testing that I’ve done for the test platform.

Test Hardware

While my second motion test used an Arduino Leonardo for control, I realized I needed more GPIO if I wanted to hook up all the necessary sensors and peripherals.  I ended up buying a knockoff Arduino Mega 2560 and have started using that.

2016-05-28 19.00.22

2016-05-28 19.00.07

I also bought a proto shield to make the peripheral connections pseudo-permanent.

2016-05-29 11.47.13

Hardwiring the L9110 motor module and the five HC-SR04 sensors allows for easy hookup to the test platform.

HC-SR04 Library

The embedded code below comprises the HC-SR04 Library for the ultrasonic distance sensors.  The trigger and echo pins are passed into the constructor and the getDistance() function triggers the ultrasonic pulse and then measures the time it takes to receive the echo pulse.  This measurement is then converted to centimeters or inches and the resulting value is returned.


#ifndef HCSR04_H
#define HCSR04_H

#include "Arduino.h"

#define CM 1
#define INCH 0

class HCSR04
{
    public:
        HCSR04(uint8_t triggerPin, uint8_t echoPin);
        HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout);
        uint32_t timing();
        uint16_t getDistance(uint8_t units, uint8_t samples);
    private:
        uint8_t _triggerPin;
        uint8_t _echoPin;
        uint32_t _timeout;
};

#endif

HCSR04.h


#include "Arduino.h"
#include "HCSR04.h"
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = 24000;
}
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = timeout;
}
uint32_t HCSR04::timing()
{
uint32_t duration;
digitalWrite(_triggerPin, LOW);
delayMicroseconds(2);
digitalWrite(_triggerPin, HIGH);
delayMicroseconds(10);
digitalWrite(_triggerPin, LOW);
duration = pulseIn(_echoPin, HIGH, _timeout);
if (duration == 0)
{
duration = _timeout;
}
return duration;
}
uint16_t HCSR04::getDistance(uint8_t units, uint8_t samples)
{
uint32_t duration = 0;
uint16_t distance;
for (uint8_t i = 0; i < samples; i++)
{
duration += timing();
}
duration /= samples;
if (units == CM)
{
distance = duration / 29 / 2 ;
}
else if (units == INCH)
{
distance = duration / 74 / 2;
}
return distance;
}

HCSR04.cpp

L9110 Library

This library controls the L9110 dual h-bridge module.  The four controlling pins are passed in.  These pins also need to be PWM compatible as the library uses the analogWrite() function to control the speed of the motors.  The control of the motors is broken out into forward(), backward(), turnLeft(), and turnRight() functions whose operation should be obvious.  These four simple motion types will work fine for now but I will need to add finer control once I get into more advanced motion types and control loops.


#ifndef L9110_H
#define L9110_H

#include "Arduino.h"

class L9110
{
    public:
        L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB);
        void forward(uint8_t speed);
        void backward(uint8_t speed);
        void turnLeft(uint8_t speed);
        void turnRight(uint8_t speed);
    private:
        uint8_t _A_IA;
        uint8_t _A_IB;
        uint8_t _B_IA;
        uint8_t _B_IB;
        void motorAForward(uint8_t speed);
        void motorABackward(uint8_t speed);
        void motorBForward(uint8_t speed);
        void motorBBackward(uint8_t speed);
};

#endif

L9110.h


#include "Arduino.h"
#include "L9110.h"
L9110::L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB)
{
_A_IA = A_IA;
_A_IB = A_IB;
_B_IA = B_IA;
_B_IB = B_IB;
pinMode(_A_IA, OUTPUT);
pinMode(_A_IB, OUTPUT);
pinMode(_B_IA, OUTPUT);
pinMode(_B_IB, OUTPUT);
}
void L9110::forward(uint8_t speed)
{
motorAForward(speed);
motorBForward(speed);
}
void L9110::backward(uint8_t speed)
{
motorABackward(speed);
motorBBackward(speed);
}
void L9110::turnLeft(uint8_t speed)
{
motorABackward(speed);
motorBForward(speed);
}
void L9110::turnRight(uint8_t speed)
{
motorAForward(speed);
motorBBackward(speed);
}
void L9110::motorAForward(uint8_t speed)
{
digitalWrite(_A_IA, LOW);
analogWrite(_A_IB, speed);
}
void L9110::motorABackward(uint8_t speed)
{
digitalWrite(_A_IB, LOW);
analogWrite(_A_IA, speed);
}
void L9110::motorBForward(uint8_t speed)
{
digitalWrite(_B_IA, LOW);
analogWrite(_B_IB, speed);
}
void L9110::motorBBackward(uint8_t speed)
{
digitalWrite(_B_IB, LOW);
analogWrite(_B_IA, speed);
}

L9110.cpp
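
As a taste of the finer control mentioned above, a signed per-wheel speed function would let the robot drive gentle arcs instead of only straight lines and in-place pivots.  This is only a sketch of a possible extension, not part of the library yet, and it assumes a matching declaration is added to the public section of L9110.h:

// Hypothetical extension: signed speeds (-255 to 255) for each wheel.
// Assumes motor A drives the left wheel and motor B the right, as
// implied by turnLeft()/turnRight() above.
void L9110::drive(int16_t leftSpeed, int16_t rightSpeed)
{
    if (leftSpeed >= 0)
    {
        motorAForward(leftSpeed);
    }
    else
    {
        motorABackward(-leftSpeed);
    }

    if (rightSpeed >= 0)
    {
        motorBForward(rightSpeed);
    }
    else
    {
        motorBBackward(-rightSpeed);
    }
}

// Example: a gentle right arc, left wheel at full speed, right at half
// motors.drive(FULL_SPEED, HALF_SPEED);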

Test Code

Lastly is the test code used to check the functionality of the libraries.  It simply tests all the sensors, prints the output, and runs through the four types of motion to verify that everything is working.


#include <L9110.h>
#include <HCSR04.h>

#define HALF_SPEED 127
#define FULL_SPEED 255

#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4

uint16_t AngleMap[] = {180, 135, 90, 45, 0};
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};
uint16_t Distances[NUM_DISTANCE_SENSORS];

L9110 motors(9, 8, 3, 2);

void setup()
{
    Serial.begin(9600);
    pinMode(29, OUTPUT);
    pinMode(28, OUTPUT);
}

void loop()
{
    updateSensors();

    motors.forward(FULL_SPEED);
    delay(1000);
    motors.backward(FULL_SPEED);
    delay(1000);
    motors.turnLeft(FULL_SPEED);
    delay(1000);
    motors.turnRight(FULL_SPEED);
    delay(1000);
    motors.forward(0);
    delay(6000);
}

void updateSensors()
{
    for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
    {
        Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
        Serial.print(AngleMap[i]);
        Serial.print(":");
        Serial.println(Distances[i]);
    }
}

basicTest.ino

As always, all the code I’m using is open source and available on the OpenADR GitHub repository.  I’ll be using these libraries and the electronics I assembled to start testing some motion algorithms!

OpenADR: Navigation Chassis v0.1

2016-01-15 21.07.52.jpg

After tons of design work and hours of printing, the prototype for the navigation chassis is done!  This is by no means meant to be a final version, but rather will serve as a prototyping platform to start work on the electronics and software for the robot.  Pictures and explanations are below!

Components

Design

Due to print bed limitations I divided the navigation unit of the robot into three main sections, two motor sections and a front section.  Each of these sections was then split in half for printing.  There are mounting holes on each part that allow everything to be assembled using M3 nuts and bolts.

In terms of sizing I’ve left a 150mm square at the center of the robot as well as the back quarter of the circle as free space.  These should provide sufficient space for any extensions I design.

Motor Assembly

2016-01-15 21.06.19

The motor assembly is the most complex part of the design, consisting of five separate parts.  For the motors I decided to use the most common motor found on eBay, a generic yellow robot motor.  They tend to be very cheap and easy to use.  They’re attached to the motor mount using M3 nuts and bolts.

2016-01-15 21.06.52

While these motors usually come with wheels, I found them to be too large and cumbersome.  Smaller wheels were necessary to conserve space so I designed simple, 3D printed ones using a spoked hub and TPU filament for the tires to provide traction.

2016-01-15 21.06.36

I couldn’t find any cheap encoders that I was happy with on eBay, so I decided to design my own using magnets and hall effect sensors.  The magnets are generic 1/4 in. ones and should be easy to find.  My reasoning for using magnetic encoders instead of optical ones is that magnetic encoders tend to be more robust in dirty environments.  I’ll go into detail about the hall effect sensor PCB I designed when I do a write-up of the electronics.
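
For a sense of how the firmware side of those encoders might work, here’s a minimal sketch of counting magnet passes with an interrupt.  The pin number and magnet count are assumptions until the PCB write-up:

// Hypothetical wiring: the hall effect sensor output is connected to an
// interrupt-capable pin and pulses each time a hub magnet passes by.
#define ENCODER_PIN   18  // assumed interrupt-capable pin on the Mega
#define TICKS_PER_REV 4   // assumed number of magnets on the wheel hub

volatile uint32_t encoderTicks = 0;

void onEncoderPulse()
{
    encoderTicks++;
}

void setup()
{
    Serial.begin(9600);
    pinMode(ENCODER_PIN, INPUT_PULLUP);
    attachInterrupt(digitalPinToInterrupt(ENCODER_PIN), onEncoderPulse, FALLING);
}

void loop()
{
    // Take an atomic snapshot of the count, then report wheel
    // revolutions once a second
    noInterrupts();
    uint32_t ticks = encoderTicks;
    interrupts();

    Serial.println(ticks / (float)TICKS_PER_REV);
    delay(1000);
}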

Ultrasonic Rangefinders

2016-01-15 21.07.16.jpg

As seen in the top image, I have five ultrasonic rangefinders that will be used for localization and mapping.  They’re mounted on either side, in the front, and 45 degrees to the left and right of the front.  This will provide the robot with a full view of obstacles ahead of it.

Color Sensor

2016-01-15 21.07.12.jpg

I’m still waiting for this part in the mail, but I’ve designed space for a TCS3200 color sensor module.  This will be used to determine what type of floor the vacuum is on.  The sensor uses internal photodiodes to measure the amount of reflected red, green, blue, and white light.  I’m hoping to use the white light magnitude as a primitive proximity sensor so the robot can detect stairs.
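
The TCS3200 reports each color channel as a square wave whose frequency scales with intensity, so reading it is essentially frequency measurement.  Here’s a rough sketch of the idea with assumed pin assignments, since I haven’t wired the real module yet:

// Hypothetical pin assignments for the TCS3200 module
#define S0_PIN  28  // frequency scaling select
#define S1_PIN  29
#define S2_PIN  30  // filter select bit 2
#define S3_PIN  31  // filter select bit 3
#define OUT_PIN 32  // square wave out; frequency scales with intensity

void setup()
{
    Serial.begin(9600);
    pinMode(S0_PIN, OUTPUT);
    pinMode(S1_PIN, OUTPUT);
    pinMode(S2_PIN, OUTPUT);
    pinMode(S3_PIN, OUTPUT);
    pinMode(OUT_PIN, INPUT);

    digitalWrite(S0_PIN, HIGH);  // S0=1, S1=0 selects 20% output scaling
    digitalWrite(S1_PIN, LOW);
}

// Select the clear (unfiltered) photodiodes and measure the output
// period; a brighter reflection gives a shorter period, which could
// double as a crude proximity reading for stair detection.
uint32_t readClearChannel()
{
    digitalWrite(S2_PIN, HIGH);  // S2=1, S3=0 selects the clear filter
    digitalWrite(S3_PIN, LOW);
    return pulseIn(OUT_PIN, LOW);  // one low pulse duration in microseconds
}

void loop()
{
    Serial.println(readClearChannel());
    delay(500);
}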

Casters

2016-01-15 21.07.30.jpg

Rather than using metal ball casters like I was originally planning, I decided on designing 3D printed rollers instead.  These are the same type of casters used on the Neato robots.

Bumpers

While I have yet to design in the bumpers for the robot, I plan on using microswitches attached to two 3D printed front bumpers to detect obstacles, one bumper on the front left and one on the front right.
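
Reading those switches should be trivial.  Here’s a minimal sketch of the idea with assumed pin numbers, wiring each microswitch to ground so the internal pull-ups can be used:

#define LEFT_BUMPER_PIN  50  // assumed pin for the front-left microswitch
#define RIGHT_BUMPER_PIN 51  // assumed pin for the front-right microswitch

void setup()
{
    Serial.begin(9600);
    // With internal pull-ups, a pressed bumper reads LOW
    pinMode(LEFT_BUMPER_PIN, INPUT_PULLUP);
    pinMode(RIGHT_BUMPER_PIN, INPUT_PULLUP);
}

void loop()
{
    bool leftHit  = (digitalRead(LEFT_BUMPER_PIN) == LOW);
    bool rightHit = (digitalRead(RIGHT_BUMPER_PIN) == LOW);

    // Both switches closed means a head-on collision; one switch alone
    // tells us which side clipped the obstacle.
    if (leftHit && rightHit)
    {
        Serial.println("bump: center");
    }
    else if (leftHit)
    {
        Serial.println("bump: left");
    }
    else if (rightHit)
    {
        Serial.println("bump: right");
    }

    delay(50);
}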

Improvements

There are a handful of things that I’d like to tweak to improve upon this design.  The current version of the navigation unit only has 3.5mm of ground clearance.  I’ll be playing with this a lot in order to strike a balance so that the cleaning module is low enough to optimally clean the floor, yet high enough so the chassis doesn’t sag and drag on the ground.

While I’m currently using five ultrasonic sensors, I’m unsure as to whether that many is needed, or if the mapping will be fine with only three.  I’d like to remove any unnecessary components to cut costs.

There are a few other difficulties I noticed when assembling the navigation unit.  Mounting the wheel on the motor proved a little difficult due to the size of the wheel well.  Since I have some extra space I’ll probably increase the well size to make mounting the wheel easier.  The same goes for the casters.  Because the axle can’t be 3D printed with the chassis design, I have to find a way to mount it on the front that allows it to be put in easily but doesn’t allow it to be knocked out during use.

As always, my design files are available on the GitHub repository.  Thoughts, comments, and criticism are always welcome.  I’ll be doing a lot of tweaks on the design and wiring up the electronics in the near future.  I hope to do a full electronics write-up soon.