OpenADR: Test Platform

The second round of the Hackaday Prize ends tomorrow, so I’ve spent the weekend hard at work on the OpenADR test platform. I’ve been doing quite a bit of soldering and software testing to streamline development and to figure out my next steps. This blog post outlines the hardware changes and software testing I’ve done for the test platform.

Test Hardware

While my second motion test used an Arduino Leonardo for control, I realized I needed more GPIO if I wanted to hook up all the necessary sensors and peripherals.  I ended up buying a knockoff Arduino Mega 2560 and have started using that.

2016-05-28 19.00.22

2016-05-28 19.00.07

I also bought a proto shield to make the peripheral connections pseudo-permanent.

2016-05-29 11.47.13

Hardwiring the L9110 motor module and the five HC-SR04 sensors allows for easy hookup to the test platform.

HC-SR04 Library

The embedded code below comprises the HC-SR04 Library for the ultrasonic distance sensors.  The trigger and echo pins are passed into the constructor and the getDistance() function triggers the ultrasonic pulse and then measures the time it takes to receive the echo pulse.  This measurement is then converted to centimeters or inches and the resulting value is returned.
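The heart of that conversion is simple: the echo time is a round trip, and sound covers roughly 1 cm per 58 µs of round-trip time. Here’s a minimal sketch of the math (the helper names are mine, not the library’s actual API):

```cpp
// Round-trip echo time to distance. Sound travels ~343 m/s, so dividing
// the echo duration in microseconds by 58 yields centimeters, and by 148
// yields inches. These helper names are illustrative, not the library's API.
double echoTimeToCentimeters(unsigned long echoMicros) {
    return echoMicros / 58.0;
}

double echoTimeToInches(unsigned long echoMicros) {
    return echoMicros / 148.0;
}
```

On the Arduino side, the echo duration would come from pulseIn() on the echo pin after a short trigger pulse.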

L9110 Library

This library controls the L9110 dual h-bridge module.  The four controlling pins are passed in.  These pins also need to be PWM compatible as the library uses the analogWrite() function to control the speed of the motors.  The control of the motors is broken out into forward(), backward(), turnLeft(), and turnRight() functions whose operation should be obvious.  These four simple motion types will work fine for now but I will need to add finer control once I get into more advanced motion types and control loops.
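The direction logic behind those four functions can be sketched off-target. This models each L9110 channel as a pair of PWM duty cycles (drive one input, hold the other low to set direction); the struct and pin pairing are assumptions for illustration, not the library’s actual interface:

```cpp
// Direction logic for the four motion primitives. Each L9110 channel has
// two inputs; PWM on one with the other held low sets direction and speed.
// This models the analogWrite() duty cycles (0-255) rather than touching
// hardware; names and pin pairing are assumptions for illustration.
struct MotorState { int leftA, leftB, rightA, rightB; };

MotorState forward(int speed)   { return { speed, 0, speed, 0 }; }
MotorState backward(int speed)  { return { 0, speed, 0, speed }; }
MotorState turnLeft(int speed)  { return { 0, speed, speed, 0 }; }  // left motor reverses
MotorState turnRight(int speed) { return { speed, 0, 0, speed }; }  // right motor reverses
```

On the real hardware, each of these four values would go out via analogWrite() on its PWM-capable pin.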

Test Code

Lastly is the test code used to check the functionality of the libraries. It simply tests all the sensors, prints the output, and runs through the four types of motion to verify that everything is working.

As always, all the code I’m using is open source and available on the OpenADR GitHub repository.  I’ll be using these libraries and the electronics I assembled to start testing some motion algorithms!

Big Seven Segment Display

Recently at work my team was throwing around the idea of real-time feedback by adding a big counter to the workspace to keep track of things like time left in the development cycle, number of static analysis warnings, etc.  For that reason I decided to build a large Seven Segment Display to provide that continuous visual feedback.  I designed the display completely from scratch to serve as a simple project that utilized a wide variety of skills.  Below I go through each step of the process and how each part works.


2015-07-24 17.30.41

For the actual segment part of the display I used transparent PLA from Hatchbox.  I didn’t have any trouble with how these turned out and am happy with the filament.  I had to do some experimentation with infill percentage in order to get the right amount of light passing through.  I believe in the end I settled on 20%.

2015-07-24 17.30.51

Here’s a close up of the semi-transparent digits. Two 5mm LEDs fit snugly into the holes in the back.

2015-07-25 22.18.20


The individual segments are designed to press-fit into black PLA frames. I “welded” the four separate frames together using a 3Doodler to extrude plastic into the gaps. For aesthetics I only welded the backs of the frames together, so while they’ll stay connected, the seams won’t hold up to much abuse.


Segment_Schem    Segment_Board.png

To keep board size small and reduce complexity I designed individual boards for each segment.  They only consist of two 5mm through hole LEDs, a current limiting resistor, and a 0.1″ header to connect to the main board.

2016-04-03 17.59.20

Here’s a pic of the boards soldered up and hot glued into the display.



The control board is just a breakout for the TPIC6B595, a high-current shift register that can power LEDs and be daisy-chained using SPI.

SparkFun ESP8266 Thing - Dev Board

I’m using the Sparkfun ESP8266 Thing Dev Board as the brains of the project.  This nifty little WiFi controller is Arduino-compatible and allows me to communicate with the display without needing to connect it to a USB port.


Like I mentioned above, I’m using the SPI bus to communicate with each individual LED controller, daisy-chaining them all together. All that’s required is to write out four separate 8-bit numbers to set the appropriate LEDs.

The server side of things is the most complicated part of the project. I borrowed heavily from the Web Server example in Sparkfun’s hookup guide. The Thing board hosts a web server which accepts GET requests with the number to set the display to. It then parses the string version of the number and converts it to the correct 8-bit sequence to set the seven segment display.
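As a sketch of that conversion step (the segment-to-bit mapping below is my assumption; the real one depends on how the segment headers are wired to the shift register outputs):

```cpp
// Digit-to-segment lookup, one bit per segment (bit 0 = a ... bit 6 = g).
// This mapping is assumed for illustration; the actual order depends on
// the wiring between the TPIC6B595 outputs and the segment boards.
const unsigned char DIGIT_PATTERNS[10] = {
    0b0111111,  // 0
    0b0000110,  // 1
    0b1011011,  // 2
    0b1001111,  // 3
    0b1100110,  // 4
    0b1101101,  // 5
    0b1111101,  // 6
    0b0000111,  // 7
    0b1111111,  // 8
    0b1101111,  // 9
};

// Fill the four bytes to shift out over SPI, most significant digit first.
void numberToBytes(int value, unsigned char out[4]) {
    for (int i = 3; i >= 0; --i) {
        out[i] = DIGIT_PATTERNS[value % 10];
        value /= 10;
    }
}
```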


The client side of things involved writing a Python script to submit the GET requests based on what data is being measured.  The current example is going through a directory and counting the occurrences of “if” in the code.


As always, the source code and 3D files will be up on my Github.  I’m travelling this weekend, but will upload all the files Monday night.


2016-04-08 12.43.14

While the LED display was plenty bright in my dim home office, I found out that they weren’t nearly strong enough to overcome the bright fluorescent lights at my office.  It’s difficult to see in the picture above, but it’s hard to read the numbers from more than a few feet away.  When I originally designed the display I erred on the side of caution and ran the cheap eBay LEDs at ~3mA.  Unfortunately, this was not nearly enough.  While it would be relatively easy to swap out the resistors to increase the current to the LEDs, I’m also unhappy with how much the plastic segments diffuse the light and think just making the LEDs brighter would only exacerbate the problem.
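For reference, the resistor swap is just Ohm’s law across the LED’s forward drop. Assuming a 5V supply and a ~2V forward voltage for these cheap red LEDs (both values are my assumptions):

```cpp
// LED current through a series resistor: I = (Vsupply - Vf) / R.
// The 5 V supply and ~2 V forward voltage are assumed values.
double ledCurrentAmps(double vSupply, double vForward, double rOhms) {
    return (vSupply - vForward) / rOhms;
}
```

A 1k resistor gives (5 - 2) / 1000 = 3 mA, which matches the dim display; dropping to 150 ohms would push each LED to a much brighter 20 mA.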

For version two I think I’ll experiment with two different ways to fix the problems above.  One option would be to use “straw hat” LEDs instead of the generic 5mm kind.  These LEDs have improved light diffusion.  I could also use hemispherical holes in the plastic digits, rather than cylindrical ones, so that the light is directed in a wider pattern.

Another option would be to use WS2812Bs. In conjunction with an improved plastic digit to help with light diffusion, these LEDs would greatly simplify the electronics of the display. WS2812B modules have built-in resistors and control logic, allowing them to be daisy-chained and controlled via a serial interface without any need for the shift register control board that I used for version one. They’re also RGB LEDs, so another benefit is that the color would be controllable.

Hopefully I’ll get a chance to start on version two soon!

Overly Dramatic Compile Button

As a software engineer I’ve long been unimpressed with the triviality associated with compiling code.  Surely the building of a masterful creation involving hundreds of source files and complex algorithms deserves more than just a small keyboard shortcut.  There should be drama, there should be flair, there should be maniacal laughter!


To fill these requirements I’ve created the Build Button!  Using the wonderfully dramatic Adafruit Massive Arcade Button, a mini Arduino Kickstarter reward, and a 3D printed enclosure,  I made a button that communicates serially to a computer through the USB port.  A Python program on the computer monitors the currently active window and tells the button to light up when the active program matches a list of programs with build shortcuts.  When the button tells the computer that it’s been pressed the computer executes the keyboard shortcut associated with the currently active window to compile the code.

The source for this project is on my GitHub.

Octopod First Test

After much soldering and programming I finally have the code base and electronics mostly set up for the Octopod. As previously stated, I’m using two Adafruit 16-channel servo controllers for handling the 24 leg servos, with an Arduino Pro Micro controlling the breakouts. I have a rudimentary Arduino library up on my GitHub.

I’m basically handling the control as a hierarchical set of libraries.  On the top level I have the Octopod class which will handle very abstract motion such as stepping, turning, rotating, and tilting.  The Octopod class has eight Leg objects.  The Leg class is responsible for, you guessed it, a leg!  It handles things such as forward and inverse kinematics, which is a topic for another post,  and some trajectory generation (e.g. arcs, lines, etc.).  Each leg consists of three Joints.  The Joint class handles the servos’ default positions and offsets and contains a Servo which is just the barebones implementation to control a hobbyist servo using the PCA9685.  I’ve done some basic testing on the Servo and Joint classes and am fairly confident that they work but I’ve been avoiding the Leg class since I don’t yet want to go through the nightmare of debugging my trigonometry.
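To make the layering concrete, here’s a stripped-down sketch of the bottom two levels of that hierarchy. The PCA9685 counts 0-4095 per PWM period, and with the usual 50Hz servo setup a servo’s travel spans roughly 150-600 counts; those endpoints, and the offset scheme, are assumptions for illustration rather than my library’s exact numbers:

```cpp
// Bottom two layers of the hierarchy: a Servo maps angles to PCA9685
// tick counts, and a Joint applies its default position and offset first.
// The 150/600 count endpoints are typical values, assumed for illustration.
struct Servo {
    int minCount = 150, maxCount = 600;
    int angleToCount(double degrees) const {  // degrees in [0, 180]
        return minCount + (int)((maxCount - minCount) * degrees / 180.0);
    }
};

struct Joint {
    Servo servo;
    double defaultDeg;  // neutral position
    double offsetDeg;   // per-servo calibration
    int command(double degrees) const {
        return servo.angleToCount(degrees + defaultDeg + offsetDeg);
    }
};
```

The Leg and Octopod classes would then sit on top, translating foot positions into three joint angles apiece.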

Instead, I decided to create a walking program just to get a chance to see the robot in action.  However, this didn’t go nearly as well as I’d hoped and the Octopod collapsed in a jittery mess.  In my haste I decided to not include the decoupling capacitors on the Adafruit boards which turned out to be a mistake since I have 24 servos which can each draw several hundred milliamps.  Once added, the operation was a little better, though not by much.  Below is a video of the robot attempting to walk.

From the looks of things, either the capacitors were not enough or my code is incredibly messed up.  In the next couple of days I’m hoping to take a closer look at the problem.

The Deconstruction

Last weekend I had the great opportunity to participate in The Deconstruction, a 48-hour hackathon where teams of people of any age or background get together and build a project of their choice. Aside from some of the creepy messages we got from trolls on our stream, it was awesome! I think setting everything aside and taking a weekend to concentrate on making something is great and really spurs creativity. I’m just lucky school is still slow and I didn’t have much work.

So on to the project.  Ever since last summer, I’ve wanted to build an automatic drink mixing machine.  I even spent a fair bit of time planning it out and ordered some of the parts.  When I presented the idea to my teammates they all jumped on board and we decided to do that this weekend.  Our team was The Dangerous Dijkstras.  A lot of details about the construction of the project are at the team site.

Here are some videos of the final project.

Jurassic Singularity

So in my first post one of my interests that I neglected to mention was hardware hacking.  In fact, as much as I love high level programming and computer science, I’d have to say my true passion is the low level stuff.  Learning about computer architecture, designing logic circuits, and programming microcontrollers provides a lot of challenges and a great sense of accomplishment.  The best part is being able to turn lines of code into something tangible by interfacing microcontrollers with real world appliances, sensors, and actuators.  This is one of the main reasons why I find robotics so appealing.

And as a result of this passion, I’ve accumulated a fair amount of old toys with the intention of taking them apart, seeing how they work, and possibly hacking their electronics to make a cool autonomous robot.  I have a bunch of RC toys from my childhood such as cars, an RC robot (Rad 2.0: very cool!), and a small hovercraft.  In addition I’ve also managed to snag some cheap used toys off of eBay or from thrift stores.  That’s how I acquired my most recent project which will be the main focus of this post.  More specifically, one day I was frequenting one of the geekier (awesomer) parts of the internet, when I thought to myself, robots are cool, and dinosaurs are cool, I wonder if someone has combined the two before (FYI most of my interests may make me seem like an eight year old, but I’m a legal adult I swear!).  So I searched  a bit around the linked website as well as the internet but could only find one or two examples of a robotic dinosaur.  But then I stumbled upon this beauty.

This gorgeous thing is a semi-autonomous, remote controlled, robotic toy raptor.  Normally retailing for $200, I managed to snag one of these suckers off eBay for a mere $15.  The only catch was there was no remote!  So I figured this would at the very least look pretty awesome tearing around in autonomous mode but optimistically, I hoped I could build a remote using an Arduino or possibly piggyback the electronics directly and have an Arduino controlling the built in microcontroller.  Well let me tell you, it works!
In an attempt to save myself effort and money I started off by looking online to see if anyone else had had a similar idea. Turns out Wowwee robots are notoriously simple to hack, and there was a whole online community focused on hacking the RoboSapien, which is a humanoid robot. After some more research I found that these ‘bots use IR wireless communication. This essentially makes the robots way more hackable, for a number of reasons. First off, most RC toys use radio frequencies between the remote and the toys. These radio frequencies are usually low power, relatively easy to generate, and cause very little interference with the outside world. The only problem is that there are several commonly used frequencies, all of which require special chips that are not commonly available in order to transmit data.

Wowwee robots, on the other hand, use IR, or infrared, light to transmit data. IR is relatively high power compared to radio frequencies, but usually has a lower range. It is also a lot more susceptible to interference, because there are many things that produce IR radiation, such as the sun. IR controllers also tend to be more directional in nature, meaning you have to point the remote at or around the thing you’re trying to control. However, IR does tend to be a more popular wireless control method because it’s very cheap and readily available. In fact, Radioshack sells IR LEDs that can be used to make custom remotes.
But anyways, while I certainly wouldn’t have used IR to control a $200 toy, it does mean that the average hacker can easily recreate a remote control. The only caveat is that in order to control a commercial device, you need to know what protocol to use and how the controller sends data. So first off I needed to find out what modulation frequency the controller used. For those who don’t know what that means, let me explain. As I said before, IR is notoriously susceptible to interference. Infrared is everywhere! Essentially anything that produces heat produces infrared light. TV remotes produce it, the sun produces it, and we do as well. It’s a pretty busy section of the spectrum, so it’d be easy to lose signals if you’re just blinking an IR LED.

A good analogy is swimming way out in the ocean. You’re probably not going to notice if the water level gradually changes by ten feet. You will notice, however, if all of a sudden the water gets really rough and there are a lot of waves constantly lifting and dropping you. This is what frequency modulation is. Rather than the IR LED being simply on or off, the LED is switched on and off thousands of times a second for a specified amount of time. The ones and zeros of a binary message can then be encoded in whether the receiver sees a modulated IR burst, how long one lasts, or how long a pause between bursts lasts.
This brings me to the Wowwee protocol. Despite the fact that not many people have hacked the raptor, I still managed to find enough info to understand the protocol, since the same command structure is used by all the Wowwee robots. This made my job of building a controller a breeze because, if the protocol had not already been documented, I would have had to buy a controller and build an IR receiver to record the IR bursts that the controller sent. So with this difficulty aside, I could go straight to making the remote.
Using this website, I found that the IR signal is modulated at 39.2kHz (I rounded to 40kHz, which was good enough) with a 1200Hz transmission frequency, which means an 833 microsecond bit width (the time taken to send a bit). The transmission starts with an eight-bit-width IR pulse; a logic high is a pause of four bit widths followed by a one-bit-width pulse, and a logic low is a one-bit-width pause followed by a one-bit-width pulse. There are a total of twelve bits in each message. Here’s the command table taken from the above website, which is now out of commission. The numbers are in hexadecimal format (base 16).
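Put together, a twelve-bit command becomes a fixed sequence of marks (modulated 40kHz bursts) and pauses. Here’s a sketch of that encoding; representing the header as a single leading mark, and sending bits MSB first, are my assumptions:

```cpp
#include <vector>

const int BIT_WIDTH_US = 833;  // from the 1200 Hz transmission frequency

struct Burst {
    bool mark;   // true = 40 kHz modulated IR on, false = pause
    int micros;
};

// Encode a 12-bit Wowwee command as mark/space durations: an
// 8-bit-width header pulse, then for each bit a pause (4 widths for a
// one, 1 width for a zero) followed by a 1-bit-width pulse.
std::vector<Burst> encodeCommand(unsigned command) {
    std::vector<Burst> out;
    out.push_back({ true, 8 * BIT_WIDTH_US });
    for (int bit = 11; bit >= 0; --bit) {  // MSB first (assumed)
        bool one = (command >> bit) & 1;
        out.push_back({ false, (one ? 4 : 1) * BIT_WIDTH_US });
        out.push_back({ true, BIT_WIDTH_US });
    }
    return out;
}
```

A transmitter would then walk this list, gating the 40kHz carrier on during each mark and off during each pause.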

[Command table: the hex codes themselves didn’t survive here; the remaining entries show a Normal Operation group and a Mode 1 group containing the Head Right, Head Left, Tail Left, and Tail Right commands.]
As you can see from the above table there are multiple signals for each command.  On the RoboRaptor controller there is a shift button which can trigger alternate functions of the Demo, Head Right, Tail Left, Tail Right, and Bite commands.

I then proceeded to create an Arduino program to control an IR LED. Using the AVR’s built-in PWM I managed to modulate the frequency to 40kHz. I then set up two NPN transistors in an AND-gate configuration, with the two inputs being the 40kHz signal and the bit logic signal, which is manually timed using the delayMicroseconds() function and some bit logic. I’ve uploaded the code to GitHub where it’s free to use and distribute. I’ll most likely be updating it over the next few weeks as I work more on the RoboRaptor. I’d like to convert the entire sketch into a RoboRaptor class with built-in functions for walking, trotting, and running, as well as many of the other commands. That way in the future it’ll be easy for other people to simply import the library and use it.

As for my other future plans with the raptor (I haven’t come up with a name for him/her yet), I’ve noticed that the front speaker seems a little quiet which I’d like to try and fix.  My dream is also to have a team of three of these, with one command Raptor and two soldiers, and get them to move around and hunt like the Velociraptors in Jurassic Park.  The only problem with this is that there is no way to control two of the toys separately since they use the same command protocol (Guess the inventors never thought someone would have more than one of their $200 toys), so I might try and take apart the electronics, probe it to see how the internal microcontroller controls the RoboRaptor’s movements, and try to imitate that with my own microcontroller that I can put in instead.  That way I can have more control over how the Raptor moves, the noises it makes, how it communicates, etc.  So while that’s an ambitious endeavor, it’s my ultimate goal!