OpenADR: Long Term Plans

With the Automation round beginning today, I decided to sketch out some of the long-term plans I have for OpenADR.  All my updates so far have referenced it as a robot vacuum, with a navigation module and a vacuum module that have to be connected together.

The way I see it, though, the navigation module will be the core focus of the platform with the modules being relatively dumb plug-ins that conform to a standard interface.  This makes it easy for anyone to design a simple module.  It’s also better from a cost perspective, as most of the cost will go towards the complex navigation module and the simple plug-ins can be cheap.  The navigation module will also do all of the power conversion and will supply several power rails to be used by the connected modules.

The modules that I’d like to design for the Hackaday Prize, if I have time, are the vacuum, mop, and wipe.  The vacuum module would provide the same functionality as a Roomba or Neato, the mop would be somewhere between a Scooba and Braava Jet, and the wipe would just be a reusable microfiber pad that would pick up dust and spills.

At some point I’d also like to expand OpenADR to outdoor domestic robots as well.  It would involve designing a new, bigger, more robust, and higher-power navigation unit to handle the tougher requirements of yard work.  From what I can tell the current robotic mowers are sorely lacking, so that would be the primary focus, but I’d eventually like to expand to leaf collection and snow blowing/shoveling modules due to the lack of current offerings in both of those spaces.

Due to limited time and resources the indoor robotics for OpenADR will be my focus for the foreseeable future, but I’m thinking ahead and have a lot of plans in mind!


OpenADR: Data Visualization

As the navigational algorithms used by the robot have gotten more complex, I’ve noticed several occasions where I don’t understand the logical paths the robot is taking.  Since the design is only going to get more complex from here on out, I decided to implement real-time data visualization that lets me see what the robot “sees” and view what actions it’s taking in response.  With the Raspberry Pi now on board, it’s incredibly simple to host a web page that presents this information in an easy-to-use format.  The code required to enable data visualization is broken up into four parts: the firmware that sends the data, the serial server that reads the data from the Arduino, the web server that hosts the data visualization web page, and the web page itself that displays the data.

Firmware

The firmware level was the simplest part.  I only added print statements that run when the robot changes speed or reads the ultrasonic sensors, printing key-value pairs over the USB serial port to the Raspberry Pi.
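As a rough illustration, the output might look something like the fragment below.  The key names, baud rate, and example values are placeholders rather than the exact format the firmware uses.

```
// Illustrative sketch of key-value output over USB serial.
// Key names, baud rate, and values are placeholders, not the robot's exact format.
void setup() {
  Serial.begin(115200);  // USB serial link to the Raspberry Pi
}

// Called whenever the drive speeds change.
void reportSpeeds(int leftSpeed, int rightSpeed) {
  Serial.print("leftMotor:");  Serial.println(leftSpeed);
  Serial.print("rightMotor:"); Serial.println(rightSpeed);
}

// Called after each ultrasonic measurement.
void reportSonar(int index, float distanceCm) {
  Serial.print("sonar"); Serial.print(index);
  Serial.print(":");     Serial.println(distanceCm);
}

void loop() {
  reportSpeeds(128, 128);  // example values
  reportSonar(0, 25.4);
  delay(500);
}
```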

Serial Server

The Raspberry Pi then runs a serial server which processes these key-value pairs and converts them into a dictionary representing the robot’s left and right motor speeds and the obstacles seen by each ultrasonic sensor.  This dictionary is then converted to JSON and written out to a file.

The resulting JSON file therefore contains just the two motor speeds and the distance reported by each of the five ultrasonic sensors.

Web Server

Using a Python HTTP server example, the Raspberry Pi runs a web server which hosts the generated JSON file and the web page that’s used for viewing the data.

Web Page

 

The visualization web page uses an HTML canvas and JavaScript to display the data from the JSON file.  The velocity and direction of the robot are shown as a curved vector, with the length representing the robot’s speed and the curvature representing the radius of its turn.  The obstacles seen by the five ultrasonic sensors are represented on the canvas by five points.

[Screenshot of the data visualization web page]

The picture above is the resulting web page without any JSON data.

Here’s a video of the robot running alongside a video of the data that’s displayed.  As you can see, the data visualization isn’t exactly real time.  Due to the communication delays over the network, some of the robot’s decisions and sensor readings get lost.

Further Steps

Having the main structure of the data visualization set up provides a powerful interface to the robot that can be used to easily add more functionality in the future.  I’d like to add more data items as I add more sensors to the robot, such as the floor color sensor, an accelerometer, a compass, etc.  At some point I also plan on creating a way to pass data to the robot from the web page, providing an interface to control the robot remotely.

I’d also like to improve the existing data visualization interface to minimize the amount of data lost to network and processing delays.  One way to do this would be to get rid of the JSON file and stream the data directly to the web page, eliminating unnecessary file I/O delays.  The Arduino also prints the ultrasonic readings one sensor at a time, requiring the serial server to read five lines from the serial port to get a reading from every sensor.  Compressing all of the sensor data onto a single line would also help improve the speed.
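As an illustration, all five readings could be packed into a single comma-separated line, something like the fragment below (the format and names are hypothetical).

```
// Hypothetical single-line format: "sonar:12.30,45.60,78.90,10.10,11.20"
void reportAllSonar(float distancesCm[5]) {
  Serial.print("sonar:");
  for (int i = 0; i < 5; i++) {
    Serial.print(distancesCm[i]);
    if (i < 4) Serial.print(",");
  }
  Serial.println();
}
```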

Another useful feature would be to record the sensor data rather than just having it visible on the web page.  Storing all of the written JSON files and replaying the run from the robot’s perspective would make it possible to review the data multiple times.

OpenADR: Much Ado About Batteries

This post is going to be less of a project update and more of a stream-of-consciousness monologue about one of the aspects of OpenADR that’s given me a lot of trouble.  Power electronics isn’t my strong suit, and the power requirements for the robot are giving me pause.  Additionally, in my last post I wrote about some of the difficulties I encountered when trying to draw more than a few hundred milliamps from a USB battery pack.  Listed below are the ideal features for the Nav unit’s power supply and the reasoning behind each.  I’ll also compare separate options for batteries and power systems to see which best fits the requirements.

Requirements

  • Rechargeable – This one should be obvious: all of the current robot vacuums on the market use rechargeable batteries.  It’d be too inconvenient and expensive for the user to have to constantly replace batteries.
  • In-circuit charging – This just means that the batteries can be charged inside of the robot without having to take it out and plug it in.  The big robot vacuum models like Neato and Roomba both automatically find their way back to a docking station and start the charging process themselves.  While auto-docking isn’t high on my list of priorities, I’d still like to make it possible to do in the future.  It would also be much more convenient for the user to not have to manually take the battery out of the robot to charge it.
  • Light weight – The lighter the robot is, the less power it needs to move.  Similarly, high energy density, the amount of energy a battery stores for its weight, is also important.
  • Multiple voltage rails – As stated in my last post, I’m powering both the robot’s motors and logic boards from the same 5V power source.  This is part of the reason I had so many issues with the motors causing the Raspberry Pi to reset.  More robust systems usually separate the motors and logic so that they use separate power supplies, preventing electrical noise caused by the motors from affecting the digital logic.  Due to the high-power electronics that will be necessary on OpenADR modules, like the squirrel cage fan I’ll be using for the vacuum, I’m also planning on having a 12V power supply that will be connected between the Nav unit and any modules.  My current design will therefore require three power supplies in total: a 5V supply for the control boards and sensors, a 6V supply for the motors, and a 12V supply for the module interface.  Each of these voltage rails can be generated from a single battery by using DC-DC converters, but each one adds complexity and cost.
  • High Power – With a guesstimated 1A at 5V for the logic and sensors, 0.5A at 6V for the drive motors, and 2A at 12V for the module power, the battery is going to need to be capable of supplying upwards of 30W of power to the robot.
  • Convenient – The battery used for the robot needs to be easy to use.  OpenADR should be the ultimate convenience system, taking away the need to perform tedious tasks.  This convenience should also apply to the battery.  Even if the battery isn’t capable of being charged in-circuit, it should at least be easily accessible and easy to charge.
  • Price – OpenADR is meant to be an affordable alternative to the expensive domestic robots on the market today.  As such, the battery and battery charging system should be affordable.
  • Availability – In a perfect world, every component of the robot would be easily available from eBay, Sparkfun, or Adafruit.  It would also be nice if the battery management and charging system used existing parts without needing to design a custom solution.
  • Safety – Most importantly, the robot needs to be safe!  With the hoverboard fiasco firmly in mind, the battery system needs to be safe with no chance of fires or other unsavory behavior.  This also relates to the availability requirement: an already available, tried-and-true system would be preferable to an untested, custom one.

Metrics

The rechargeability and safety requirements, as well as the need for three separate power supplies, are non-negotiable factors that I can’t compromise on.  I also have no control over whether in-circuit charging is feasible, how convenient a system is, how heavy it is, or the availability of parts, so while I’ll provide a brief discussion of each factor, they will not be the main factors I use for comparison.  I will instead focus on power, or rather the efficiency of the power delivery to the three power supplies, and price.

Battery Chemistry

I considered three types of battery chemistry for my comparison.  The first is LiPo or Li-Ion batteries, which currently have the highest energy density on the battery market, making them a good candidate for a lightweight robot.  The latest Neato and Roomba robots also use Li-Ion batteries.  The big drawback here is safety.  As demonstrated by the hoverboard fires, if they’re not properly handled or charged they can be explosive.  This almost completely rules out a custom charging solution in my mind.  Luckily, there are plenty of options for single-cell LiPo/Li-Ion chargers available from both SparkFun and Adafruit.

Second are LiFePO4 batteries.  While not as popular as LiPos and Li-Ions due to their lower energy density, they’re much safer; a LiFePO4 cell can even be punctured without catching fire.  Other than that, they’re very similar to LiPo/Li-Ion batteries.

Last are NiMH batteries.  They were the industry standard for most robots for a while and are used in all but the latest Roomba and Neato models.  They have recently fallen out of favor due to their lower energy density compared to both types of lithium batteries.  I haven’t included any NiMH systems in my comparison because they don’t provide any significant advantages over the other two chemistries.

Systems

  1. 1S Li-Ion – A single-cell Li-Ion would probably be the easiest option as far as the battery goes.  Single-cell Li-Ion batteries with protection circuitry are used extensively in the hobby community, so they’re easy to obtain, with high-capacity cells and simple charging electronics readily available.  This would make in-circuit charging possible.  The trade-off for simplicity on the battery side is complexity in the DC-DC conversion circuits.  A single Li-Ion cell only has a cell voltage of 3.7V, making it necessary to convert the voltage for all three power supplies.  Because the lower voltage also means lower-power batteries, several would need to be paralleled to achieve the same amount of power as the other battery configurations.  Simple boost converters could supply power to both the logic and motor power supplies, while the 12V rail would require several step-up converters to supply the requisite amount of current at the higher voltage.  Luckily these modules are cheap and easy to find on eBay.
  2. 3S LiPo – Another option would be a 3-cell LiPo.  These batteries are widely used for quadcopters and other hobby RC vehicles, making them easy to find.  Also, because a three-cell LiPo has a battery voltage of 11.1V, no voltage conversion would be necessary when supplying the 12V power supply.  Only two step-down regulators would be needed, supplying power to the logic and motor power supplies.  These regulators are also widely available on eBay and are just as cheap as the step-up regulators.  The downside is that, as I’ve mentioned before, LiPos are inherently temperamental and can be dangerous.  I also had trouble finding high-quality charging circuitry for multi-cell batteries that could charge the battery in-circuit, meaning the user would have to remove the battery for charging.
  3. 4S LiFePO4 – Lastly is the four-cell LiFePO4 battery system.  It has all the same advantages as the three-cell LiPo configuration, but with the added safety of the LiFePO4 chemistry.  Also, because the four cells result in a battery voltage of 12.8V-13.2V, it would be possible to put a diode on the positive battery terminal, making it safe to connect several batteries in parallel while still staying above the desired 12V module supply voltage.  LiFePO4 batteries are also easier to charge and don’t have the same exploding issues during charging as LiPo batteries, so it would be possible to design custom charging circuitry to enable in-circuit charging for the robot.  The only downside, as far as LiFePO4 batteries go, is availability.  Because this chemistry isn’t as widely used as LiPos and Li-Ions, there are fewer options when it comes to finding batteries to use.

Power Efficiency Comparison

To further examine the three options listed above, I compared the power delivery systems of each and roughly calculated their efficiency in Google Sheets.  I assumed the nominal voltage of the batteries in my calculations and that the total energy capacity of each battery was the same.  From there I guesstimated the power required by the robot, the current draw on the batteries, and the power efficiency of the whole system.  I also used this power efficiency to calculate the runtime lost to the inefficiency.  The efficiencies I used for the DC-DC converters were estimated using the data and voltage-efficiency curves listed in their datasheets.

Due to the high currents necessary for the single-cell Li-Ion system, I assumed that there would be an always-on P-channel MOSFET after each battery, effectively acting as a diode with a lower voltage drop and allowing multiple batteries to be connected in parallel.  I also assumed a diode would be placed after the 12V boost regulators, since multiple regulators would be needed to supply the desired 1.5A.

 

 

For the 4S LiFePO4 system I assumed that a diode would be connected between the battery and the 12V bus, allowing for parallelization of batteries and bringing the voltage closer to the desired 12V.
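To make the comparison concrete, here’s a rough sketch of the kind of calculation behind those numbers.  The rail loads are the guesstimates from the requirements section above; the converter efficiencies in the sketch are illustrative placeholders rather than real datasheet values.

```
// Rough sketch of the efficiency calculation, not the actual spreadsheet.
// Rail loads are the guesstimates from the requirements section; the converter
// efficiencies below are illustrative placeholders.
#include <cstdio>

int main() {
  const double railPower[3]    = {5.0 * 1.0, 6.0 * 0.5, 12.0 * 2.0};  // W: logic, motors, modules
  const double converterEff[3] = {0.90, 0.90, 0.85};                  // placeholder efficiencies

  double loadPower = 0.0;     // power delivered to the rails
  double batteryPower = 0.0;  // power drawn from the battery

  for (int i = 0; i < 3; i++) {
    loadPower += railPower[i];
    batteryPower += railPower[i] / converterEff[i];
  }

  const double efficiency = loadPower / batteryPower;
  std::printf("Load power:        %.1f W\n", loadPower);
  std::printf("Battery power:     %.1f W\n", batteryPower);
  std::printf("System efficiency: %.1f%%\n", efficiency * 100.0);
  std::printf("Runtime lost:      %.1f%%\n", (1.0 - efficiency) * 100.0);
  return 0;
}
```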

Conclusion

Looking at the facts and power efficiencies I listed previously, the 4S LiFePO4 battery is looking like the most attractive option.  While its efficiency would be slightly lower than the 3S LiPo, I think the added safety and possibility for in-circuit charging makes it worth it.  While I’m not sure if OpenADR will ultimately end up using LiFePO4 batteries, that’s the path I’m going to explore for now.  Of course, power and batteries aren’t really in my wheelhouse so comments and suggestions are welcome.

OpenADR: Connecting to Raspberry Pi


The most frustrating part of developing the wall following algorithm from my last post was constantly moving back and forth between my office, where I tweak and load firmware, and my test area (the kitchen hallway).  To solve this problem, and to streamline development overall, I decided to add a Raspberry Pi to the robot.  Specifically, I’m using a Raspberry Pi 2 that I had lying around, but I expect to switch to a Pi Zero once I get the code and design in a more final state.  By installing the Arduino IDE and enabling SSH, I’m now able to access and edit code wirelessly.

Having a full-blown Linux computer on the robot also adds plenty of opportunity for new features.  Currently I’m planning on adding camera support via the official camera module and a web server to serve a web page for manual control, settings configuration, and data visualization.

I expected hooking up the Raspberry Pi to be as easy as connecting it to the Arduino over USB, but this turned out not to be the case.  Having a bare-bones SBC revealed a few problems with my wiring and code.

The first issue I noticed was the Arduino resetting when the motors were running, but this was easily attributable to the current limit on the Raspberry Pi USB ports.  A USB 2.0 port, like those on the Pi, can only supply up to 500mA of current.  Motors similar to the ones I’m using are specced at 250mA each, so having both motors accelerating suddenly to full speed caused a massive voltage drop which reset the Arduino.  This was easily fixed by connecting the motor supply of the motor controller to the 5V output on the Raspberry Pi GPIO header.  Additionally, setting the max_usb_current flag in /boot/config.txt allows up to 2A on the 5V line, minus what the Pi uses.  2A should be more than sufficient once the motors, Arduino, and other sensors are hooked up.

The next issue I encountered was much more nefarious.  With the motors hooked up directly to the 5V on the Pi, changing from full-speed forward to full-speed backward caused everything to reset!  I don’t have an oscilloscope to confirm this, but my suspicion was that the motors put so much noise on the power supply that both boards were resetting.  This is where one of the differences between the Raspberry Pi and a full computer is most evident.  On a regular PC there’s plenty of space for robust power circuitry and filters to prevent noise on the 5V USB lines, but because space on the Pi is at a premium, the minimal filtering on its 5V bus wasn’t sufficient to remove the noise caused by the motors.  When I originally wrote the motor controller library I didn’t have motor noise in mind and simply set the speed of the motor instantly.  When both motors switch from full-speed forward to full-speed backward, the sudden reversal causes a huge spike on the power supply, as explained in this app note.  I was eventually able to fix this by rewriting the library to include some acceleration and deceleration.  By limiting the acceleration of the motors, the noise was reduced enough that the boards no longer reset.
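The fix boils down to stepping the PWM value toward the commanded speed a little at a time instead of writing it directly.  The snippet below is a simplified illustration of that idea for a single motor, not the actual library code; the pins, step size, and update rate are placeholders.

```
// Simplified illustration of ramping one motor instead of jumping straight to
// the commanded speed. Pin numbers, step size, and update rate are placeholders.
const int PIN_FWD = 5;        // L9110 input A (PWM-capable)
const int PIN_REV = 6;        // L9110 input B (PWM-capable)
const int ACCEL_STEP = 15;    // maximum PWM change per update
const int UPDATE_MS = 10;     // time between updates

int currentSpeed = 0;         // signed PWM value, -255..255
int targetSpeed = 0;

void setSpeed(int target) {
  targetSpeed = constrain(target, -255, 255);
}

void setup() {
  pinMode(PIN_FWD, OUTPUT);
  pinMode(PIN_REV, OUTPUT);
  setSpeed(255);              // example command: ramp up to full speed forward
}

void loop() {
  // Step toward the target instead of switching to it instantly.
  if (currentSpeed < targetSpeed) {
    currentSpeed = min(currentSpeed + ACCEL_STEP, targetSpeed);
  } else if (currentSpeed > targetSpeed) {
    currentSpeed = max(currentSpeed - ACCEL_STEP, targetSpeed);
  }

  // Drive one L9110 input with PWM and hold the other low.
  if (currentSpeed >= 0) {
    analogWrite(PIN_REV, 0);
    analogWrite(PIN_FWD, currentSpeed);
  } else {
    analogWrite(PIN_FWD, 0);
    analogWrite(PIN_REV, -currentSpeed);
  }

  delay(UPDATE_MS);
}
```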

While setting up the Raspberry Pi took longer than I’d hoped due to power supply problems, I’m glad I got a chance to learn the limits on my design and will keep bypass capacitors in mind when I design a permanent board.  I’m also excited for the possibilities having a Raspberry Pi on board provides and look forward to adding more advanced features to OpenADR!

OpenADR: Wall Following

After several different attempts at wall following algorithms and a lot of tweaking, I’ve finally settled on a basic algorithm that mostly works and allows the OpenADR navigation unit to roughly follow a wall.  The test run is below:

Test Run

 

Algorithm

I used a very basic subsumption architecture as the basis for my wall following algorithm.  This just means that the robot has a list of behaviors that it performs.  Each behavior is triggered by a condition, and the conditions are prioritized.  Using my own algorithm as an example, the triggers and behaviors of the robot are listed below:

  • A wall closer than 10cm in front triggers the robot to turn left until the wall is on the right.
  • If the front-right sensor is closer to the wall than the right sensor, the robot is angled towards the wall and so turns left.
  • If the robot is closer than the desired distance from the wall, it turns left slightly to move further away.
  • If the robot is further than the desired distance from the wall, it turns right slightly to move closer.
  • Otherwise it just travels straight.

The robot goes through these conditions sequentially, and as soon as one is met it performs the triggered action and skips the remaining checks.  As the test run video shows, this method mostly works but certainly has room for improvement.
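A minimal sketch of this kind of prioritized check is shown below.  The thresholds and motion helpers are placeholders standing in for the real sensor readings and motor library, not the actual OpenADR source.

```
// Minimal sketch of the prioritized, subsumption-style check.
// Distances are in centimeters; thresholds and helpers are placeholders.
const float FRONT_LIMIT = 10.0;   // a wall closer than this triggers a left turn
const float TARGET_DIST = 15.0;   // desired distance from the wall on the right

// Stubs standing in for the motor library calls.
void turnLeft()      { /* spin left in place */ }
void veerLeft()      { /* arc gently away from the wall */ }
void veerRight()     { /* arc gently toward the wall */ }
void driveStraight() { /* both motors forward at the same speed */ }

void followWall(float front, float frontRight, float right) {
  if (front < FRONT_LIMIT) {
    turnLeft();            // wall ahead: turn until it's on the right
  } else if (frontRight < right) {
    turnLeft();            // angled into the wall: turn away from it
  } else if (right < TARGET_DIST) {
    veerLeft();            // too close: ease away from the wall
  } else if (right > TARGET_DIST) {
    veerRight();           // too far: ease back toward the wall
  } else {
    driveStraight();       // on track: keep going straight
  }
}
```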

Source

The full source for the wall following algorithm is available in the OpenADR GitHub repository.

Conclusion

There are still plenty of problems that I need to tackle for the robot to be able to successfully navigate my apartment.  I tested it in a very controlled environment and the algorithm I’m currently using isn’t robust enough to handle oddly shaped rooms or obstacles.  It also still tends to get stuck butting against the wall and completely ignores exterior corners.

Some of the obstacle and navigation problems will hopefully be remedied by adding bump sensors to the robot.  The sonar coverage of the area around the robot is sparser than I originally thought, and the 2cm blind spot around the robot is causing some problems.  More advanced navigation will also be helped by improving my algorithm to build a map of the room in the robot’s memory, as well as adding encoders for more precise position tracking.

OpenADR: Test Platform

The second round of the Hackaday Prize ends tomorrow, so I’ve spent the weekend hard at work on the OpenADR test platform.  I’ve been doing quite a bit of soldering and software testing to streamline development and to determine the next steps I need to take.  This blog post outlines the hardware changes and software testing that I’ve done for the test platform.

Test Hardware

While my second motion test used an Arduino Leonardo for control, I realized I needed more GPIO if I wanted to hook up all the necessary sensors and peripherals.  I ended up buying a knockoff Arduino Mega 2560 and have started using that.


I also bought a proto shield to make the peripheral connections pseudo-permanent.

Hardwiring the L9110 motor module and the five HC-SR04 sensors allows for easy hookup to the test platform.

HC-SR04 Library

This library handles the HC-SR04 ultrasonic distance sensors.  The trigger and echo pins are passed into the constructor, and the getDistance() function triggers the ultrasonic pulse and then measures the time it takes to receive the echo pulse.  That measurement is converted to centimeters or inches and the resulting value is returned.
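The general shape of such a class looks something like the sketch below; it follows the usual HC-SR04 trigger/echo approach, but the names and details are illustrative rather than the actual library code.

```
// Minimal sketch of an HC-SR04 wrapper class; not the actual OpenADR library.
class UltrasonicSketch {
  public:
    UltrasonicSketch(int triggerPin, int echoPin)
        : _trigger(triggerPin), _echo(echoPin) {
      pinMode(_trigger, OUTPUT);
      pinMode(_echo, INPUT);
    }

    // Fires a 10us trigger pulse and converts the echo time to centimeters.
    float getDistanceCm() {
      digitalWrite(_trigger, LOW);
      delayMicroseconds(2);
      digitalWrite(_trigger, HIGH);
      delayMicroseconds(10);
      digitalWrite(_trigger, LOW);

      // Echo pulse width in microseconds; time out after ~25ms (no echo).
      unsigned long duration = pulseIn(_echo, HIGH, 25000UL);

      // Sound travels ~0.0343 cm/us; divide by 2 for the round trip.
      return duration * 0.0343 / 2.0;
    }

  private:
    int _trigger;
    int _echo;
};
```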

L9110 Library

This library controls the L9110 dual H-bridge module.  The four controlling pins are passed in; these pins need to be PWM-compatible, as the library uses the analogWrite() function to control the speed of the motors.  Control of the motors is broken out into forward(), backward(), turnLeft(), and turnRight() functions whose operation should be obvious.  These four simple motion types will work fine for now, but I will need to add finer control once I get into more advanced motion types and control loops.
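A minimal sketch of that kind of wrapper is below; the pin handling and fixed in-place turns are illustrative rather than the actual library code.

```
// Minimal sketch of an L9110 dual H-bridge wrapper; not the actual library.
// Each motor uses two PWM-capable pins: PWM one input for speed and hold the
// other low to set direction.
class DualMotorSketch {
  public:
    DualMotorSketch(int leftA, int leftB, int rightA, int rightB)
        : _la(leftA), _lb(leftB), _ra(rightA), _rb(rightB) {
      pinMode(_la, OUTPUT); pinMode(_lb, OUTPUT);
      pinMode(_ra, OUTPUT); pinMode(_rb, OUTPUT);
    }

    void forward(int speed)   { drive(speed,  speed);  }
    void backward(int speed)  { drive(-speed, -speed); }
    void turnLeft(int speed)  { drive(-speed, speed);  }   // spin in place
    void turnRight(int speed) { drive(speed,  -speed); }
    void stop()               { drive(0, 0); }

  private:
    // speed is a signed PWM value, -255..255, per side.
    void drive(int left, int right) {
      setSide(_la, _lb, left);
      setSide(_ra, _rb, right);
    }

    void setSide(int pinA, int pinB, int speed) {
      if (speed >= 0) { analogWrite(pinB, 0); analogWrite(pinA, speed); }
      else            { analogWrite(pinA, 0); analogWrite(pinB, -speed); }
    }

    int _la, _lb, _ra, _rb;
};
```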

Test Code

Lastly is the test code used to check the functionality of the libraries.  It simply reads all the sensors, prints the output, and runs through the four types of motion to verify that everything is working.
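Roughly, the test sketch looks like the example below, which builds on the two illustrative classes above; the pin assignments and timings are placeholders rather than the actual test code.

```
// Rough test sketch built on the two example classes above; pin numbers and
// timings are placeholders, not the actual test code.
UltrasonicSketch sonar[5] = {
  UltrasonicSketch(22, 23), UltrasonicSketch(24, 25), UltrasonicSketch(26, 27),
  UltrasonicSketch(28, 29), UltrasonicSketch(30, 31)
};
DualMotorSketch motors(5, 6, 9, 10);   // four PWM-capable pins on the Mega

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Print each sensor reading...
  for (int i = 0; i < 5; i++) {
    Serial.print("sonar"); Serial.print(i); Serial.print(": ");
    Serial.println(sonar[i].getDistanceCm());
  }

  // ...then cycle through the four motion types.
  motors.forward(150);   delay(1000);
  motors.backward(150);  delay(1000);
  motors.turnLeft(150);  delay(1000);
  motors.turnRight(150); delay(1000);
  motors.stop();         delay(1000);
}
```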

As always, all the code I’m using is open source and available on the OpenADR GitHub repository.  I’ll be using these libraries and the electronics I assembled to start testing some motion algorithms!

OpenADR: Motion Test #2

I just finished testing v0.2 of the chassis and the results are great!  The robot runs pretty well on hardwood and tile, but it still has a little trouble on carpet.  I think I may need to further increase the ground clearance, but for now it’ll work well enough to start writing code!

To demo the robot’s motion I wired up an Arduino Leonardo to an L9110 motor driver to control the motors and an HC-SR04 to perform basic obstacle detection.  I’m using a backup phone battery for power.  The demo video is below:

Over the next couple of days I plan on wiring up the rest of the ultrasonic sensors and doing some more advanced motion control.  Once that’s finished I’ll post more detailed explanations of the electronics and the source code!