OpenADR: Connecting to Raspberry Pi


The most frustrating part of developing the wall following algorithm from my last post was the constant moving back and forth between my office, to tweak and load firmware, and my test area (the kitchen hallway).  To solve this problem, and to streamline development overall, I decided to add a Raspberry Pi to the robot.  Specifically, I’m using a Raspberry Pi 2 that I had lying around, but I expect to switch to a Pi Zero once I get the code and design in a more final state.  By installing the Arduino IDE and enabling SSH, I’m now able to access and edit code wirelessly.

Having a full-blown Linux computer on the robot also adds plenty of opportunity for new features.  Currently I’m planning on adding camera support via the official camera module and a web server to serve a web page for manual control, settings configuration, and data visualization.

While I expected hooking up the Raspberry Pi to be as easy as connecting to the Arduino over USB, this turned out not to be the case.  Having a barebones SBC revealed a few problems with my wiring and code.

The first issue I noticed was the Arduino resetting when the motors were running, but this was easily attributable to the current limit on the Raspberry Pi USB ports.  A USB 2.0 port, like those on the Pi, can only supply up to 500mA of current.  Motors similar to the ones I’m using are specced at 250mA each, so having both motors accelerating suddenly to full speed caused a massive voltage drop which reset the Arduino.  This was easily fixed by connecting the motor supply of the motor controller to the 5V output on the Raspberry Pi GPIO header.  Additionally, setting the max_usb_current flag in /boot/config.txt allows up to 2A on the 5V line, minus what the Pi uses.  2A should be more than sufficient once the motors, Arduino, and other sensors are hooked up.
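For reference, enabling the higher current limit is a one-line change (this assumes the standard Raspbian boot configuration file; a reboot is needed for it to take effect):

```
# /boot/config.txt
max_usb_current=1
```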

The next issue I encountered was much more nefarious.  With the motors hooked up directly to the 5V on the Pi, changing from full-speed forward to full-speed backward caused everything to reset!  I don’t have an oscilloscope to confirm this, but my suspicion was that the motors were putting so much noise on the power supply that both boards were resetting.  This is where one of the differences between the Raspberry Pi and a full computer is most evident.  On a regular PC there’s plenty of space for robust power circuitry and filters to prevent noise on the 5V USB lines, but because space on the Pi is at a premium, the minimal filtering on the 5V bus wasn’t sufficient to remove the noise caused by the motors.  When I originally wrote the motor controller library I didn’t have motor noise in mind and just set the speed of the motors instantly.  When both motors switch from full-speed forward to full-speed backward, the sudden reversal causes a huge spike on the power supply, as explained in this app note.  I was eventually able to fix this by rewriting the library to include some acceleration and deceleration.  By limiting the acceleration of the motors, the noise was reduced enough that the boards no longer reset.
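The idea behind the fix can be sketched as a simple slew-rate limiter: on each update, the commanded speed moves toward the target by at most a fixed step, so the driver never sees an instantaneous reversal.  This is only an illustration of the approach, not the actual library code; `rampToward` and `MAX_STEP` are names I’ve made up here.

```cpp
#include <cstdint>

// Maximum change in PWM duty per update; an assumed tuning value.
const int16_t MAX_STEP = 15;

// Step the commanded speed toward the target by at most MAX_STEP,
// so a full forward-to-backward reversal happens gradually.
int16_t rampToward(int16_t current, int16_t target)
{
    int16_t diff = target - current;
    if (diff > MAX_STEP) diff = MAX_STEP;
    if (diff < -MAX_STEP) diff = -MAX_STEP;
    return current + diff;
}
```

Calling this once per control-loop iteration spreads a 255-to-−255 swing over many updates instead of one.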

While setting up the Raspberry Pi took longer than I’d hoped due to power supply problems, I’m glad I got a chance to learn the limits on my design and will keep bypass capacitors in mind when I design a permanent board.  I’m also excited for the possibilities having a Raspberry Pi on board provides and look forward to adding more advanced features to OpenADR!

OpenADR: Wall Following

After several different attempts at wall following algorithms and a lot of tweaking, I’ve finally settled on a basic algorithm that mostly works and allows the OpenADR navigation unit to roughly follow a wall.  The test run is below:

Test Run

 

Algorithm

I used a very basic subsumption architecture as the basis for my wall following algorithm.  This just means that the robot has a prioritized list of behaviors that it performs, each triggered by a condition.  Using my own algorithm as an example, the triggers and behaviors of the robot are listed below:

  • A wall closer than 10cm in front triggers the robot to turn left until the wall is on the right.
  • If the front-right sensor is closer to the wall than the right sensor, the robot is angled towards the wall and so turns left.
  • If the robot is closer than the desired distance from the wall, it turns left slightly to move further away.
  • If the robot is further than the desired distance from the wall, it turns right slightly to move closer.
  • Otherwise it just travels straight.

The robot checks these conditions sequentially, and if a condition is met it performs the triggered behavior and skips the remaining checks.  As displayed in the test run video, this method mostly works but certainly has room for improvement.

Source


#include <L9110.h>
#include <HCSR04.h>
#define HALF_SPEED 127
#define FULL_SPEED 255
#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4
#define TARGET_DISTANCE 6
#define TOLERANCE 2
// Mapping of all of the distance sensor angles
uint16_t AngleMap[] = {180, 135, 90, 45, 0};
// Array of distance sensor objects
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};
uint16_t Distances[NUM_DISTANCE_SENSORS];
uint16_t Distances_Previous[NUM_DISTANCE_SENSORS];
L9110 motors(9, 8, 3, 2);
void setup()
{
Serial.begin(9600);
// Initialize all distances to 0
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances[i] = 0;
Distances_Previous[i] = 0;
}
}
void loop()
{
updateSensors();
// If there's a wall ahead
if (Distances[DEG_90] < 10)
{
uint8_t minDir;
Serial.println("Case 1");
// Reverse slightly
motors.backward(FULL_SPEED);
delay(100);
// Turn left until the wall is on the right
do
{
updateSensors();
minDir = getClosestWall();
motors.turnLeft(FULL_SPEED);
delay(100);
}
while ((Distances[DEG_90] < 10) && (minDir != DEG_0));
}
// If the front right sensor is closer to the wall than the right sensor, the robot is angled toward the wall
else if ((Distances[DEG_45] <= Distances[DEG_0]) && (Distances[DEG_0] < (TARGET_DISTANCE + TOLERANCE)))
{
Serial.println("Case 2");
// Turn left to straighten out
motors.turnLeft(FULL_SPEED);
delay(100);
}
// If the robot is too close to the wall and isn't getting farther
else if ((checkWallTolerance(Distances[DEG_0]) == -1) && (Distances[DEG_0] <= Distances_Previous[DEG_0]))
{
Serial.println("Case 3");
motors.turnLeft(FULL_SPEED);
delay(100);
motors.forward(FULL_SPEED);
delay(100);
}
// If the robot is too far from the wall and isn't getting closer
else if ((checkWallTolerance(Distances[DEG_0]) == 1) && (Distances[DEG_0] >= Distances_Previous[DEG_0]))
{
Serial.println("Case 4");
motors.turnRight(FULL_SPEED);
delay(100);
motors.forward(FULL_SPEED);
delay(100);
}
// Otherwise keep going straight
else
{
motors.forward(FULL_SPEED);
delay(100);
}
}
// A function to retrieve the distance from all sensors
void updateSensors()
{
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances_Previous[i] = Distances[i];
Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
Serial.print(AngleMap[i]);
Serial.print(":");
Serial.println(Distances[i]);
delay(1);
}
}
// Retrieve the angle of the closest wall
uint8_t getClosestWall()
{
uint8_t tempMin = 255;
uint16_t tempDist = 500;
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
if (min(tempDist, Distances[i]) == Distances[i])
{
tempDist = Distances[i];
tempMin = i;
}
}
return tempMin;
}
// Check if the robot is within the desired distance from the wall
int8_t checkWallTolerance(uint16_t measurement)
{
if (measurement < (TARGET_DISTANCE - TOLERANCE))
{
return -1;
}
else if (measurement > (TARGET_DISTANCE + TOLERANCE))
{
return 1;
}
else
{
return 0;
}
}

Conclusion

There are still plenty of problems that I need to tackle for the robot to be able to successfully navigate my apartment.  I tested it in a very controlled environment and the algorithm I’m currently using isn’t robust enough to handle oddly shaped rooms or obstacles.  It also still tends to get stuck butting against the wall and completely ignores exterior corners.

Some of the obstacle and navigation problems will hopefully be remedied by adding bump sensors to the robot.  The sonar coverage of the area around the robot is sparser than I originally thought and the 2cm blindspot around the robot is causing some problems.  The more advanced navigation will also be helped by improving my algorithm to build a map of the room in the robot’s memory in addition to adding encoders for more precise position tracking.

OpenADR: Test Platform

The second round of the Hackaday Prize ends tomorrow, so I’ve spent the weekend hard at work on the OpenADR test platform.  I’ve been doing quite a bit of soldering and software testing to streamline development and to determine my next steps.  This blog post will outline the hardware changes and software testing that I’ve done for the test platform.

Test Hardware

While my second motion test used an Arduino Leonardo for control, I realized I needed more GPIO if I wanted to hook up all the necessary sensors and peripherals.  I ended up buying a knockoff Arduino Mega 2560 and have started using that.


I also bought a proto shield to make the peripheral connections pseudo-permanent.

Hardwiring the L9110 motor module and the five HC-SR04 sensors allows for easy hookup to the test platform.

HC-SR04 Library

The embedded code below comprises the HC-SR04 Library for the ultrasonic distance sensors.  The trigger and echo pins are passed into the constructor and the getDistance() function triggers the ultrasonic pulse and then measures the time it takes to receive the echo pulse.  This measurement is then converted to centimeters or inches and the resulting value is returned.


#ifndef HCSR04_H
#define HCSR04_H
#include "Arduino.h"
#define CM 1
#define INCH 0
class HCSR04
{
public:
HCSR04(uint8_t triggerPin, uint8_t echoPin);
HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout);
uint32_t timing();
uint16_t getDistance(uint8_t units, uint8_t samples);
private:
uint8_t _triggerPin;
uint8_t _echoPin;
uint32_t _timeout;
};
#endif

HCSR04.h


#include "Arduino.h"
#include "HCSR04.h"
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = 24000;
}
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = timeout;
}
uint32_t HCSR04::timing()
{
uint32_t duration;
digitalWrite(_triggerPin, LOW);
delayMicroseconds(2);
digitalWrite(_triggerPin, HIGH);
delayMicroseconds(10);
digitalWrite(_triggerPin, LOW);
duration = pulseIn(_echoPin, HIGH, _timeout);
if (duration == 0)
{
duration = _timeout;
}
return duration;
}
uint16_t HCSR04::getDistance(uint8_t units, uint8_t samples)
{
uint32_t duration = 0;
uint16_t distance;
for (uint8_t i = 0; i < samples; i++)
{
duration += timing();
}
duration /= samples;
if (units == CM)
{
distance = duration / 29 / 2;
}
else if (units == INCH)
{
distance = duration / 74 / 2;
}
return distance;
}

HCSR04.cpp

L9110 Library

This library controls the L9110 dual h-bridge module.  The four controlling pins are passed in.  These pins also need to be PWM compatible as the library uses the analogWrite() function to control the speed of the motors.  The control of the motors is broken out into forward(), backward(), turnLeft(), and turnRight() functions whose operation should be obvious.  These four simple motion types will work fine for now but I will need to add finer control once I get into more advanced motion types and control loops.


#ifndef L9110_H
#define L9110_H
#include "Arduino.h"
class L9110
{
public:
L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB);
void forward(uint8_t speed);
void backward(uint8_t speed);
void turnLeft(uint8_t speed);
void turnRight(uint8_t speed);
private:
uint8_t _A_IA;
uint8_t _A_IB;
uint8_t _B_IA;
uint8_t _B_IB;
void motorAForward(uint8_t speed);
void motorABackward(uint8_t speed);
void motorBForward(uint8_t speed);
void motorBBackward(uint8_t speed);
};
#endif

L9110.h


#include "Arduino.h"
#include "L9110.h"
L9110::L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB)
{
_A_IA = A_IA;
_A_IB = A_IB;
_B_IA = B_IA;
_B_IB = B_IB;
pinMode(_A_IA, OUTPUT);
pinMode(_A_IB, OUTPUT);
pinMode(_B_IA, OUTPUT);
pinMode(_B_IB, OUTPUT);
}
void L9110::forward(uint8_t speed)
{
motorAForward(speed);
motorBForward(speed);
}
void L9110::backward(uint8_t speed)
{
motorABackward(speed);
motorBBackward(speed);
}
void L9110::turnLeft(uint8_t speed)
{
motorABackward(speed);
motorBForward(speed);
}
void L9110::turnRight(uint8_t speed)
{
motorAForward(speed);
motorBBackward(speed);
}
void L9110::motorAForward(uint8_t speed)
{
digitalWrite(_A_IA, LOW);
analogWrite(_A_IB, speed);
}
void L9110::motorABackward(uint8_t speed)
{
digitalWrite(_A_IB, LOW);
analogWrite(_A_IA, speed);
}
void L9110::motorBForward(uint8_t speed)
{
digitalWrite(_B_IA, LOW);
analogWrite(_B_IB, speed);
}
void L9110::motorBBackward(uint8_t speed)
{
digitalWrite(_B_IB, LOW);
analogWrite(_B_IA, speed);
}

view raw

L9110.cpp

hosted with ❤ by GitHub

Test Code

Lastly is the test code used to check the functionality of the libraries.  It simply reads all the sensors, prints the output, and runs through the four types of motion to verify that everything is working.


#include <L9110.h>
#include <HCSR04.h>
#define HALF_SPEED 127
#define FULL_SPEED 255
#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4
uint16_t AngleMap[] = {180, 135, 90, 45, 0};
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};
uint16_t Distances[NUM_DISTANCE_SENSORS];
L9110 motors(9, 8, 3, 2);
void setup()
{
Serial.begin(9600);
pinMode(29, OUTPUT);
pinMode(28, OUTPUT);
}
void loop()
{
updateSensors();
motors.forward(FULL_SPEED);
delay(1000);
motors.backward(FULL_SPEED);
delay(1000);
motors.turnLeft(FULL_SPEED);
delay(1000);
motors.turnRight(FULL_SPEED);
delay(1000);
motors.forward(0);
delay(6000);
}
void updateSensors()
{
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
Serial.print(AngleMap[i]);
Serial.print(":");
Serial.println(Distances[i]);
}
}

basicTest.ino

As always, all the code I’m using is open source and available on the OpenADR GitHub repository.  I’ll be using these libraries and the electronics I assembled to start testing some motion algorithms!

OpenADR: Motion Test #2

I just finished testing v0.2 of the chassis and the results are great!  The robot runs pretty well on hardwood/tile but it still has a little trouble on carpet.  I think I may need to further increase the ground clearance, but for now it’ll work well enough to start writing code!

To demo the robot’s motion I wired up an Arduino Leonardo to an L9110 motor driver to control the motors and an HC-SR04 to perform basic obstacle detection.  I’m using a backup phone battery for power.  The demo video is below:

Over the next couple of days I plan on wiring up the rest of the ultrasonic sensors and doing some more advanced motion control.  Once that’s finished I’ll post more detailed explanations of the electronics and the source code!

OpenADR: Navigation Chassis v0.2

Based on the results of the first motion test, I made several tweaks to the chassis for version 0.2.


  • As I stated in the motion test, ground clearance was a big issue that caused problems on carpet.  I don’t know the general measurements for carpet thickness, but the medium-pile carpet in my apartment was tall enough that the chassis was lifted up and the wheels couldn’t get a good purchase on the ground.  This prevented the robot from moving at all.  I’m hoping 7.5mm will be enough ground clearance, but I’ll find out once I do another motion test.


  • I added a side wall to the robot to increase stiffness.  With the flexibility of PLA, the middle of the robot sagged and as a result the center dragged on the ground.  Adding an extra dimension to the robot in the form of a wall helps with stiffness and prevents most sagging.  My first attempt at this was in the form of an integrated side wall, allowing the entire thing to be printed in one piece.  This, however, turned out to be a bad idea.  With the wall in the way it was extremely difficult to get tools into the chassis to add in the vitamins (e.g. motors, casters, etc.).  DFM is important!  So instead of integrated walls I went for a more modular approach.



  • The base is much like it was before, flat and simple, but I’ve added holes for attaching the side walls as separate pieces.  The side walls have matching right-angle bolt holes so each wall can be bolted onto the base after everything has been assembled.


  • One thing I don’t like about this method is all of the bolt heads sticking out of the bottom of the chassis.  I’d like to go back at a later time and figure out how to recess the bolts so the heads don’t poke through.
  • With the stiffness added by the side wall I decided to decrease the thickness of the chassis base and wall from 2.5mm to 2mm.  This results in saved plastic and print time.
  • Adding the side wall had the downside of interfering with the sonar sensors.  I’m not too familiar with the beam patterns of the HC-SR04 module, but I didn’t want to risk the wall causing signal loss or false positives, so I moved the distance sensors outward so the tips of the ultrasonic transducers are flush with the side wall.  Unfortunately, due to the minimum detectable distance of the sensor, this means that there will be a 2cm blindspot all around the robot that will have to be dealt with.  I’m unsure of how I’ll deal with this at the moment, but I’ll most likely end up using IR proximity sensors or bumper switches to detect immediate obstacles.
  • Other than the above changes, I mostly just tweaked the sizing and placement of certain items to optimize assembly and part fits.

This is still very much a work in progress and there are still several things I’d like to add before I start working on the actual vacuum module, but I’m happy with the progress so far on the navigation chassis and think it’ll be complete enough to work as a test platform for navigation and mapping.

OpenADR: Motion Test #1

This is just a quick update on the OpenADR project.  Last night I hooked up the motor driver and ran a quick test with the motors hardwired to drive forward.  I wanted to verify that the motors were strong enough to move the robot and also see if there were any necessary tweaks that needed to be applied to the mechanics.

Looking at the result, there are a few things I want to change.  Most notably, the center of the chassis sags a lot.  The center was dragging on the ground during this test and it wouldn’t even run on carpet.  This is mostly due to the fact that the test chassis has no sidewalls and PLA isn’t very stiff, so there’s a great deal of flex that allows the center to touch the ground.  This can be fixed by adding some small sidewalls in the next test chassis revision.

Additionally, I’m not happy with the ground clearance.  This chassis has about 3.5mm of ground clearance which isn’t quite enough to let the wheels rest firmly on the ground in medium pile carpet.  In light of this, I’ll be increasing the ground clearance experimentally, with my next test having 7.5mm of clearance.

OpenADR: Navigation Chassis v0.1


After tons of design work and hours of printing, the prototype for the navigation chassis is done!  This is by no means meant to be a final version, but rather will serve as a prototyping platform to start work on the electronics and software for the robot.  Pictures and explanations are below!

Components

Design

Due to print bed limitations I divided the navigation unit of the robot into three main sections, two motor sections and a front section.  Each of these sections was then split in half for printing.  There are mounting holes on each part that allow everything to be assembled using M3 nuts and bolts.

In terms of sizing I’ve left a 150mm square at the center of the robot as well as the back quarter of the circle as free space.  These should provide sufficient space for any extensions I design.

Motor Assembly


The motor assembly is the most complex part of the design, consisting of five separate parts.  For the motors I decided to use the most common motor found on eBay, a generic yellow robot motor.  They tend to be very cheap and easy to use.  They’re attached to the motor mount using M3 nuts and bolts.


While these motors usually come with wheels, I found them to be too large and cumbersome.  Smaller wheels were necessary to conserve space so I designed simple, 3D printed ones using a spoked hub and TPU filament for the tires to provide traction.


I couldn’t find any cheap encoders that I was happy with on eBay, so I decided to design my own using magnets and hall effect sensors.  The magnets are generic 1/4 in. and should be easy to find.  My reasoning behind using magnetic encoders instead of optical is because magnetic encoders tend to be more robust when it comes to dirty environments.  I’ll go into detail about the hall effect sensor PCB I designed when I do a write-up for the electronics.
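The math for turning encoder ticks into distance traveled is straightforward.  The sketch below is illustrative only; the magnet count per revolution and the wheel diameter are assumed values, not measurements from the actual build.

```cpp
#include <cstdint>
#include <cmath>

const double PI = 3.14159265358979323846;

// Assumed geometry: 6 magnets around the wheel hub (one hall-effect
// pulse each) and a 4cm wheel diameter.  Both are placeholders.
const double MAGNETS_PER_REV = 6.0;
const double WHEEL_DIAMETER_CM = 4.0;

// Convert a tick count from the hall-effect encoder into centimeters
// traveled: revolutions times wheel circumference.
double ticksToCm(uint32_t ticks)
{
    double revs = ticks / MAGNETS_PER_REV;
    return revs * PI * WHEEL_DIAMETER_CM;
}
```

With these numbers, six ticks equal one revolution, or roughly 12.6cm of travel.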

Ultrasonic Rangefinders


As seen in the top image, I have five ultrasonic rangefinders that will be used for localization and mapping.  They’re mounted on either side, in the front, and 45 degrees to the left and right of the front.  This will provide the robot with a full view of obstacles ahead of it.

Color Sensor


I’m still waiting for this part in the mail, but I’ve designed space for a TCS3200 color sensor module.  This will be used for determining what type of floor the vacuum is on.  This color sensor uses internal photodiodes to measure the amount of reflected red, green, blue, and white light.  I’m hoping that I’ll be able to use the white light magnitude as a primitive proximity reading so the robot can detect stairs.

Casters


Rather than using metal ball casters like I was originally planning, I decided on designing 3D printed rollers instead.  These are the same type of casters used on the Neato robots.

Bumpers

While I have yet to design the bumpers for the robot, I plan on using microswitches attached to two 3D printed front bumpers to detect obstacles, one bumper on the front left and one on the front right.

Improvements

There are a handful of things that I’d like to tweak to improve upon this design.  The current version of the navigation unit only has 3.5mm of ground clearance.  I’ll be playing with this a lot in order to strike a balance so that the cleaning module is low enough to optimally clean the floor, yet high enough so the chassis doesn’t sag and drag on the ground.

While I’m currently using five ultrasonic sensors, I’m unsure as to whether that many is needed, or if the mapping will be fine with only three.  I’d like to remove any unnecessary components to cut costs.

There are a few other difficulties I noticed when assembling the navigation unit.  Mounting the wheel on the motor proved a little difficult due to the size of the wheel well.  Since I have some extra space I’ll probably increase the well size to make mounting the wheel easier.  The same goes for the casters.  Because the axle can’t be 3D printed with the chassis design, I have to find a way to mount it on the front that allows it to be put in easily but doesn’t allow it to be knocked out during use.

As always my design files are available on the Github repository.  Thoughts, comments, and criticism are always welcome.  I’ll be doing a lot of tweaks on the design and wiring up the electronics in the near future.  I hope to do a full electronics write-up soon.

OpenADR: On Modularity

I realized that in my last blog post I stated how modularity was important to the design of OpenADR, but I never actually defined my vision for what modularity means and what it would entail.  Thus far I’ve identified this project as a robot vacuum.  My ultimate goal, however lofty it may be, is to design a system of robots and extensions rather than just an autonomous vacuum.  There are home robots of all kinds on the market today such as vacuums, mops, and gutter-cleaners.  They’re always complicated and always expensive.  The thing is, I don’t think they need to be.

As an engineer I look at the design of these robots and see unnecessary complexity.  As a programmer I see redundancy and chances to optimize.  As a Maker I see an opportunity for the community to make something better.  Those thoughts were my inspiration, and the reason I decided to create this Open Source project.  I wanted to make something simple and elegant with a low barrier to entry so that anyone can contribute.

To aid in this endeavor I looked at the current domestic robot market and saw the redundancy in the designs.  A robot vacuum, a robot mop, and other types of robot can really be broken up into two parts.  There’s the navigation unit (e.g., the localization sensors, drive motors, encoders) and the cleaning system (e.g., vacuum, mop, waxer).  Rather than buying a robot for each separate task and effectively paying twice for the navigation hardware, I’ve decided to first design a navigation robot and leave a large slot in the robot body to allow for various swappable extensions.  The first, proof-of-concept extension will be the vacuum.  Doing this will reduce the number of duplicated components and cut the costs of the overall project.  It will also save time on hardware and software design.

Another benefit of this modular system is additional ease of development.  Large software projects like the Linux kernel have componentized designs, breaking the software into parts based on architecture, use case, abstraction level, etc.  By having these functionally separate components, new contributors don’t need to understand the entire project just to add to a small part.  The barrier of entry to the project is reduced and project members can focus only on the components they’re most comfortable with.

This is my ultimate goal with OpenADR.  I’m most comfortable in the realms of embedded design and firmware development.  I hope that by providing a strong base for the project, people who are much more proficient than I in other areas will join in to help OpenADR move forward.

OpenADR: Project Goals

Rather than diving head first into this project without a plan, I’ve taken the time to outline some goals that will help guide the design of the robot vacuum’s first revision.  They’re listed below, categorized by their relative importance, along with a brief explanation of each one.

Primary

Vacuum with reasonable strength:

A robot vacuum should obviously be capable of vacuuming.  Reasonable strength is pretty relative, but my intention is that it’ll be strong enough to contend with a regular robotic vacuum.  CNET has an interesting test that they use in their robot vacuum reviews, as shown in this Neato Botvac review, that I’d like to replicate at some point with my final product.

Obstacle detection:

The vacuum should be able to detect when it hits an obstacle.  A front bumper that can distinguish between the left and right side seems to be the industry standard, so I’ll probably do the same thing.

Works on various floor types:

Fairly simple, I’d like to be able to run the vacuum on different types of floor, such as tile, carpet, and hardwood.  Luckily all three are represented in my apartment so testing this should be easy.  Some tweaking of wheel size and floor clearance will probably be necessary to find a good balance between all types of floors.

1/2 hour runtime:

This is really just a baseline.  I’d like to be able to run the vacuum much longer than this, but a half hour seems like a good goal for a prototype.

Modular:

I think modularity is important for multiple reasons.  It’ll allow for an upgrade path by designing in the ability to upgrade vacuums without changing the navigation module, and vice-versa.  It also makes the system easier to develop by allowing design of individual modules, rather than modifying a monolithic robot.

Easy to empty:

Since I have full control over the mechanics and I’m already going to make the system modular, having a simple connection mechanism between the vacuum and navigation modules shouldn’t be difficult.

Secondary

Cleans corners:

Due to its round shape, vacuums like the Roomba aren’t able to clean corners as well as squarish vacuums like the Neato.  The upside to round vacuums is that they aren’t as likely to get stuck in corners.  The square front makes navigation and turning in corners more complex, so I will be using the round design for now.  The use of a side brush might help mitigate the corner-cleaning problem.

Obstacle avoidance:

Rather than simply detecting obstacles when they’ve been hit, this would entail some sort of rangefinding to map out the locations of obstacles so they can be avoided.

Intelligent movement algorithm:

Rather than simply moving about randomly, a more complicated movement algorithm would enable the vacuum to move about the room in a pattern that maximizes the amount of floor vacuumed.

WiFi connectivity:

Wireless connectivity would greatly aid in the ability to debug and control the robot.  With the ESP8266 being so cheap, WiFi is the obvious choice.

Tertiary

Room mapping:

Using a variety of sensors, room mapping would allow the robot to create an internal representation of the room it’s currently cleaning.  It could then use this map to refine its movement algorithm over time.

HEPA filter:

For starters I’ll just be using a simple mesh, but somewhere down the line I’d like to find an easily available and cheap HEPA filter that can be integrated with the vacuum for better filtering.

Adapts to floor type:

The suction requirements between hard floors and carpets are very different.  Adding sensors to detect the type of floor the vacuum is on would allow the robot to throttle down the fan speed when on hardwood or tile, saving power and increasing battery life.

Ultrathin:

Most robot vacuums are between 7cm and 10cm in height.  After doing some preliminary measurements around my apartment, I found that a few pieces of furniture had gaps of around 5cm underneath them.  To handle these situations I’d like to eventually design the robot to be below that height.

App/Web controlled:

With the Internet of Things gaining steam, control apps are being released for all sorts of appliances.  The newest Roomba and Neato robot vacuums both have phone apps allowing scheduling, statistics, and other features to be controlled remotely.  Creating a phone or web app would allow greater control of the robot and provide advanced debugging capabilities.

Wireless firmware updating:

Being able to wirelessly program and update the robot would be a huge step forward for usability and development.  The ability to push out OTA updates via the app would also allow for a more diverse group of beta testers by removing the need for manual connection and driver knowledge from the equation.

While I’m sure this list will continue to grow and evolve as the project progresses, I think there’s plenty here to work off of for the basic design.  If you have any suggestions or think I missed something, feel free to leave comments below!

Open Autonomous Domestic Robots


I’m the kind of person who’ll go to great lengths to avoid boring tasks, despite the fact that the work spent avoiding them greatly outweighs the work of the tasks themselves.  One of the most tedious chores I can think of is vacuuming my apartment; therefore I’m willing to put in a lot of work engineering a system to do the job for me.  While it would be far simpler to buy a Roomba, it doesn’t seem like it should cost hundreds of dollars for what amounts to a vacuum strapped onto some motors.

Instead, I’m going to try and accomplish something of similar function with cheap hardware.  Additionally, since the current domestic robots on the market require the purchase of separate robots for each target function (vacuuming, mopping, etc.), I’m going to try and design a modular system to both cut costs and reduce the redundancy between different robots.

While my initial goal is just to get the robot to vacuum, I hope that a modular design will lend itself to greater flexibility in the future.  Somewhere down the line I’d like to have swappable attachments that mop, increase bin capacity, add battery life, etc.  I also think that a modular approach will assist with development by allowing incremental updates for each component.

I anticipate this being a complex project requiring custom hardware, firmware, and software, so I’ve added this as a new project on my projects page.  I’ll attempt to split up the work into logical chunks and do individual write-ups for each milestone.

The code for the project is hosted on my Github under OpenADR.  Because a vacuum is an appliance that every home (hopefully) has, I’m sharing my work, designs, and code as an open source project in the hopes that other people will have comments, suggestions, or be interested in helping out.