PiGRRL Switch: Test Controller in Action!

2017-03-18 12.55.34

So I didn’t get a chance to take a video of the controller in action, but I do have a picture of me playing the original Zelda using my soldered-up controller prototype. The Raspberry Pi and LCD are hooked up to one battery bank and the controller is hooked up to a separate one. Going through this test I noticed a few things that can be improved.

For starters, I had a hellish time getting enough power to both the Raspberry Pi and LCD. For whatever reason the power supply didn’t seem to be able to give enough juice to keep the Raspberry Pi from rebooting. The display has two micro USB ports on it that are used to power the screen and provide touchscreen feedback to the Pi. However, since I currently don’t need the touchscreen, I just powered the screen directly from the battery bank. To circumvent these power issues in the future I’ll need to see if I can get the Pi to provide more current from its USB ports. If that doesn’t work, I can use one of the micro USB ports for touchscreen feedback, cut its red power wire so the screen can’t draw power from that port, and use the second USB port to power the screen directly from the battery bank.

Another issue I noticed with my controller is the lack of menu buttons. The RetroPie interface requires Start and Select buttons for navigation, so I had to map those to an attached keyboard since I only have four action buttons and four directional buttons on my prototype. I’ll also want shoulder buttons, since the later consoles all have at least two and the PS1 has four.

The next step for the controller interface is designing the PCB!

PiGRRL Switch: Controller Test Firmware

As a follow-up to my last post, I wanted to go over the Arduino firmware I used for my primary controller test. Essentially, it reads the states of multiple buttons and prints a sequence of 1s and 0s (1 for unpressed, 0 for pressed) to the Bluetooth module.


enum DPad {RIGHT = 2, UP = 3, LEFT = 4, DOWN = 5};
enum Buttons {A = 6, B = 7, X = 8, Y = 9};
void setup() {
// Set the DPad buttons to pullup inputs
pinMode(RIGHT, INPUT_PULLUP);
pinMode(UP, INPUT_PULLUP);
pinMode(LEFT, INPUT_PULLUP);
pinMode(DOWN, INPUT_PULLUP);
// Set the action buttons to pullup inputs
pinMode(A, INPUT_PULLUP);
pinMode(B, INPUT_PULLUP);
pinMode(X, INPUT_PULLUP);
pinMode(Y, INPUT_PULLUP);
Serial.begin(9600);
delay(5000);
}
void loop() {
// Print the Key packet
// DPad
Serial.print("K:");
Serial.print(digitalRead(RIGHT));
Serial.print(digitalRead(UP));
Serial.print(digitalRead(LEFT));
Serial.print(digitalRead(DOWN));
// Action buttons
Serial.print(digitalRead(A));
Serial.print(digitalRead(B));
Serial.print(digitalRead(X));
Serial.println(digitalRead(Y));
delay(10);
}

The whole program is embedded above; below I’ll go through each section individually and explain what it does.


enum DPad {RIGHT = 2, UP = 3, LEFT = 4, DOWN = 5};
enum Buttons {A = 6, B = 7, X = 8, Y = 9};

These enums define human-readable names for the pins hooked up to the buttons, so that their usage in the code is immediately obvious.

Nasty literals!

This isn’t strictly necessary given the simplicity of this program, but it’s generally best practice to avoid bare integer literals (magic numbers) in code.


// Set the DPad buttons to pullup inputs
pinMode(RIGHT, INPUT_PULLUP);
pinMode(UP, INPUT_PULLUP);
pinMode(LEFT, INPUT_PULLUP);
pinMode(DOWN, INPUT_PULLUP);
// Set the action buttons to pullup inputs
pinMode(A, INPUT_PULLUP);
pinMode(B, INPUT_PULLUP);
pinMode(X, INPUT_PULLUP);
pinMode(Y, INPUT_PULLUP);

This is the pin mode setup in the Arduino setup() function. The buttons are wired directly to ground, so when a button is pressed its pin sees 0V. When a button isn’t pressed its pin would float, so I’m enabling the Arduino’s internal pullup resistors so that an unpressed pin sees 5V.


// Print the Key packet
// DPad
Serial.print("K:");
Serial.print(digitalRead(RIGHT));
Serial.print(digitalRead(UP));
Serial.print(digitalRead(LEFT));
Serial.print(digitalRead(DOWN));
// Action buttons
Serial.print(digitalRead(A));
Serial.print(digitalRead(B));
Serial.print(digitalRead(X));
Serial.println(digitalRead(Y));

Here the state of the buttons is read and printed to the Bluetooth module over UART.  Because of the pullups, ‘1’ corresponds to unpressed and ‘0’ corresponds to pressed.  The button states are preceded by a “K:” to act as the key of a key-value pair.  This means that when the controller driver on the Raspberry Pi sees a “K” it knows it’s about to get a list of digital buttons.  I’m planning on using this in the future to send more complex data than just a few buttons.  For example, I could use the keys “JX” and “JY” to denote the two axis values of an analog joystick.  The last print is a Serial.println() call, which appends a newline character.  This is an easy way to delimit the data so the controller driver can differentiate between packets.
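To make the packet format concrete, here’s a rough sketch of what the receiving side might look like. This isn’t the actual driver (that’s still to come); it’s a minimal example assuming the HC-05 has already been bound to /dev/rfcomm0 with rfcomm, and it just prints which buttons are held in each “K” packet.

// Hypothetical sketch of the receiving end -- not the actual driver.
// Assumes the HC-05 is already bound to /dev/rfcomm0 via rfcomm.
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    // Names in the same order the firmware prints them
    const char* buttonNames[] = {"RIGHT", "UP", "LEFT", "DOWN", "A", "B", "X", "Y"};
    std::ifstream port("/dev/rfcomm0");
    if (!port)
    {
        std::cerr << "Failed to open /dev/rfcomm0" << std::endl;
        return 1;
    }
    std::string packet;
    // Each Serial.println() on the Arduino ends a packet with a newline,
    // so getline() naturally splits the stream into packets.
    while (std::getline(port, packet))
    {
        // Only handle "K" (key) packets; other keys like "JX"/"JY" could be
        // dispatched here later.
        if (packet.rfind("K:", 0) != 0 || packet.size() < 10)
        {
            continue;
        }
        for (int i = 0; i < 8; i++)
        {
            // '0' means pressed because of the pullups on the controller
            if (packet[2 + i] == '0')
            {
                std::cout << buttonNames[i] << " pressed" << std::endl;
            }
        }
    }
    return 0;
}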


delay(10);

Lastly is the delay that sets the update rate of the program.  With a delay of 10 milliseconds at the end of the loop, new button states are sent to the driver at roughly 100Hz.  This rate was chosen arbitrarily for now, but in the future I’ll have to tweak it to balance several factors.

First off is the bandwidth of the Bluetooth connection.  As far as I can tell, the HC-05 module uses Bluetooth 2.0, which has a data throughput of 2.1 Mbit/s, or 275251.2 bytes per second.  Divide this by the update rate and you get the total amount of data that can be sent in one cycle of the program.  This also assumes a perfect connection with no lost packets.  Bluetooth’s serial port profile uses re-transmission to verify that sent packets are received, which means wireless noise can slow the connection down while the Bluetooth module attempts to resend a bad packet.  Additionally, if I go back to the BLE module and am able to get it working, BLE only has a throughput of 35389.44 bytes per second, or 13% of Bluetooth 2.0.  So while I’d be saving power, I’d be restricting the update rate and the amount of data I can send from the controller.
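To put some rough numbers on that: at the current 100Hz update rate, the Bluetooth 2.0 figure works out to about 2,750 bytes per cycle, and even the BLE figure leaves roughly 350 bytes per cycle, while each key packet here is only about a dozen bytes (“K:”, eight digits, and the line ending).  So bandwidth isn’t the limiting factor yet, but it could become one as more data gets added to the packets.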

Another thing to consider is the update rate of the controller driver running on the Raspberry Pi.  Serial ports have buffers to store incoming data, which means that if the driver updates more slowly than the controller it won’t be able to process all of the incoming data and the buffer will overflow.  This problem will exhibit itself as lag in the controls.  The driver needs to update at the same rate as, or faster than, the controller.

The last consideration is debouncing.  This is a topic that has been covered extensively elsewhere, but suffice it to say that if the controller updates too fast it will pick up phantom bounces from the switch contacts and send erroneous button presses to the driver.
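For reference, below is a minimal sketch of one common software debouncing approach (not something the current firmware does yet): a pin’s state change is only trusted after the reading has stayed stable for a few milliseconds. The pin and threshold here are just placeholders.

// Minimal software debounce sketch -- not part of the current firmware.
// A state change is only reported after the pin has been stable for DEBOUNCE_MS.
const uint8_t BUTTON_PIN = 6;        // the A button from the firmware above
const unsigned long DEBOUNCE_MS = 5;

int stableState = HIGH;              // debounced state (HIGH = unpressed with pullups)
int lastReading = HIGH;
unsigned long lastChangeTime = 0;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  int reading = digitalRead(BUTTON_PIN);
  if (reading != lastReading) {
    // The raw reading changed; restart the stability timer
    lastChangeTime = millis();
    lastReading = reading;
  }
  if ((millis() - lastChangeTime) > DEBOUNCE_MS && reading != stableState) {
    // The reading has been stable long enough to trust it
    stableState = reading;
    Serial.println(stableState == LOW ? "pressed" : "released");
  }
}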

Once everything has come together and is working to a degree I’m happy with, I’ll come back to these small issues and tweak everything until it works perfectly.  Until then, my next step is writing the controller driver!  As always, my code is available on my GitHub in the PiGRRL_Switch repository.

PiGRRL Switch: Creating the Controllers

With the screen chosen and working, the next step for creating the PiGRRL Switch was prototyping the controllers.  Initially, I wanted something cheap and easy like a breakout board that acted as a Bluetooth HID joystick.  I was immediately drawn to Adafruit’s EZ-Key which acts as a Bluetooth keyboard.  At the time I’m writing this, however, it’s out of stock and seems to have been for a while.  Additionally, because it acts as a Bluetooth keyboard and not a joystick, it rules out any possibility of adding analog controls in the future.

Another alternative to a Bluetooth HID breakout would be taking apart a cheap Bluetooth joystick and putting it in a 3D printed casing.  However, I decided this would greatly reduce the design flexibility of the controllers and might make it difficult to reconfigure the controllers on the fly (i.e. using two JoyCons as one controller vs. using them as two separate controllers).

So with those two options off the table, I decided instead to use a Bluetooth serial bridge.  The HM-10 BLE and HC-05 Bluetooth 2.0 modules are both cheap and plentiful, and they provide a good solution at the cost of some extra work.  These modules can be hooked up to the UART of an Arduino and paired via Bluetooth.  Once connected, the module shows up as a virtual serial port in Linux, allowing the serial data to be read just as if the Arduino were connected via USB or FTDI.  The one thing this setup can’t do is load firmware wirelessly.

2017-03-13 22.18.32

The next step was setting up the initial design on a breadboard.  Above is an Arduino Pro Mini, four pushbuttons wired to the digital pins, and the HM-10 BLE module.  I decided to use the HM-10 because of its lower power requirements (BLE being an initialism for Bluetooth Low Energy).  The code for the Arduino reads the values from the digital pins and prints out eight characters to signify which buttons are pressed (‘1’ for unpressed and ‘0’ for pressed).  Right now I’m using a byte for each button, which is wasteful, so at some point in the future I’ll go back and make the code more efficient so each button is represented by a single bit.
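As a rough sketch of what that optimization might look like, all eight button states could be packed into a single byte instead of eight ASCII characters. The pin numbers below match the enums used in the test firmware, but this isn’t code that’s running on the controller yet.

// Hypothetical sketch of packing all eight button states into one byte.
// Bit i holds the state of pin (2 + i): bit 0 is RIGHT (pin 2) through
// bit 7 being Y (pin 9), with 1 still meaning unpressed.
uint8_t packButtons() {
  uint8_t packed = 0;
  for (uint8_t i = 0; i < 8; i++) {
    if (digitalRead(2 + i) == HIGH) {
      packed |= (1 << i);
    }
  }
  return packed;
}

// Note: sending this with Serial.write() makes the packet binary, so the
// newline-delimited framing used so far would need to change as well.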

2017-03-14 22.50.16

Once everything was wired up and running, I had a lot of trouble finding an app that could connect to the HM-10 as a serial terminal.  Apparently the BLE standard has a lot of bells and whistles that make configuration a bit more difficult.  After trying several different apps I eventually found Serial Bluetooth Terminal, which can connect to both BLE and regular Bluetooth devices via a serial terminal.  Above is a screenshot of my phone connected to the controller with the button status being transmitted.

2017-03-14 20.31.24

2017-03-14 20.31.37

With my proof of concept working, I soldered everything onto a proto-board, this time with eight buttons: four for a D-pad and four action buttons.

With that complete, the next step was connecting to the Raspberry Pi over a serial terminal.  Unfortunately, this was much more difficult than I expected.  I could pair and connect to the HM-10, but couldn’t find a way to mount it as a serial terminal.

2017-03-15 21.01.17

Rather than continue further down the rabbit hole, I decided to drop the BLE module for now and switch to the HC-05 modules I bought as a backup.  Those have been around for years and have been used extensively with Arduino and Raspberry Pi.  Once that module was paired and connected, mounting it as a serial terminal was as simple as using the following commands to connect and then print out the values read from the module:

sudo rfcomm bind /dev/rfcomm0 <MAC Address>
sudo cat /dev/rfcomm0

2017-03-17 19.20.25

2017-03-17 22.01.22

Lastly, I connected the controller, screen, and Raspberry Pi to battery packs and verified everything still worked as expected.  Success!  The next step is writing a program for Linux that reads the button data coming off the serial port and uses it to emulate a controller for the console.

Alien Clock: Description

Note: This is a mirror of my Alien Cuckoo Clock project submitted to the 2017 Hackaday SciFi contest. For more information, visit the project link.

The Alien Cuckoo Clock consists of several discrete pieces that will be combined to form the clock. The different pieces are:

  • Facehugger
  • Chestburster
  • Clock mechanism
  • Cuckoo mechanism
  • Mannequin
  • Electronics

Thingiverse is a great repository for premade 3D printed files and many of the designers there are far more skilled in 3D modelling than I will ever be, so I’ll be reusing open source designs from there for the Facehugger and Chestburster. I will, however, be painting them myself.

Rather than reinventing the wheel (errr… clock) I decided to buy a clock mechanism to use as the clock internals and hands. The upside of this is not having to handle clock controls or complicated gearing, but the downside is that the cuckoo mechanism will have to be synced with the clock somehow so it can be properly triggered at the top of the hour.

I’ll be designing the cuckoo mechanism myself, probably using gears or other basic components from Thingiverse. I have quite a few spare, generic, yellow motors from OpenADR, so I anticipate designing a mechanism to convert that rotary motion into the linear motion required by the Chestburster cuckoo.

The mannequin will serve as the body of the Facehugger/Chestburster victim (chestburstee?). I anticipate finding a spare full-body mannequin, if possible, and cutting off the lower portion and arms. However, if I can’t find one I can create the torso using duct tape casting and find a lifelike mannequin head on Amazon.

I’m hoping to keep the electronics for this project as simple as possible. I have plenty of Arduinos lying around, so I’ll be using one for the controller, in addition to a few end-stop switches for the cuckoo mechanism and maybe a Hall effect sensor to detect the top of the hour.

OpenADR: ESP32 Initial Test

As I mentioned in a previous post, I’m attempting to test the ESP32 as a cheap replacement for my current Raspberry Pi and Arduino Mega setup.  I’m hoping that doing this will save me some complexity and money in the long run.

Unfortunately, with the ESP32 being a new processor, there’s not a lot of Arduino support implemented yet.  Neither the ultrasonic sensor library nor the motor driver library compiles yet, due to unimplemented core functions in the chip’s Arduino library.

Luckily a new version of the ESP-IDF (IoT Development Framework) is set to be released November 30, with the Arduino core most likely being updated a few days later.  This release will implement a lot of the features I need, so I’ll be able to continue testing out the board.  In the meantime I plan on adding more sensors to the Navigation Chassis v0.3 in preparation, as well as working on some other projects.

OpenADR: Connecting to Raspberry Pi

2016-06-11 23.25.45

The most frustrating part of developing the wall following algorithm from my last post was constantly moving back and forth between my office, where I tweak and load firmware, and my test area (the kitchen hallway).  To solve this problem, and to streamline development overall, I decided to add a Raspberry Pi to the robot.  Specifically, I’m using a Raspberry Pi 2 that I had lying around, but I expect to switch to a Pi Zero once I get the code and design into a more final state.  By installing the Arduino IDE and enabling SSH, I’m now able to access and edit code wirelessly.

Having a full-blown Linux computer on the robot also adds plenty of opportunity for new features.  Currently I’m planning on adding camera support via the official camera module and a web server to serve a web page for manual control, settings configuration, and data visualization.

While I expected hooking up the Raspberry Pi to be as easy as connecting to the Arduino over USB, this turned out not to be the case.  Having a barebones SBC revealed a few problems with my wiring and code.

The first issue I noticed was the Arduino resetting when the motors were running, but this was easily attributable to the current limit on the Raspberry Pi USB ports.  A USB 2.0 port, like those on the Pi, can only supply up to 500mA of current.  Motors similar to the ones I’m using are specced at 250mA each, so having both motors accelerating suddenly to full speed caused a massive voltage drop which reset the Arduino.  This was easily fixed by connecting the motor supply of the motor controller to the 5V output on the Raspberry Pi GPIO header.  Additionally, setting the max_usb_current flag in /boot/config.txt allows up to 2A on the 5V line, minus what the Pi uses.  2A should be more than sufficient once the motors, Arduino, and other sensors are hooked up.

The next issue I encountered was much more nefarious.  With the motors hooked up directly to the 5V on the Pi, changing from full-speed forward to full-speed backward caused everything to reset!  I don’t have an oscilloscope to confirm this, but my suspicion was that there was so much noise placed on the power supply by the motors that both boards were resetting.  This is where one of the differences between the Raspberry Pi and a full computer is most evident.  On a regular PC there’s plenty of space to add robust power circuitry and filters to prevent noise on the 5V USB lines, but because space on the Pi is at a premium the minimal filtering on the 5V bus wasn’t sufficient to remove the noise caused by the motors.  When I originally wrote the motor controller library I didn’t have motor noise in mind and instead just set the speed of the motor instantly.  In the case of both motors switching from full-speed forward to full-speed backward, the sudden reversal causes a huge spike in the power supply, as explained in this app note.  I was able to eventually fix this by rewriting the library to include some acceleration and deceleration.  By limiting the acceleration on the motors, the noise was reduced enough that the boards no longer reset.
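Below is a rough illustration of the kind of ramping I’m talking about; the actual fix lives in the updated L9110 library on GitHub, so treat this as a sketch of the idea rather than the real code. Instead of jumping straight to a requested speed, the output steps toward it a little at a time each update.

// Rough illustration of speed ramping -- the real fix lives in the updated
// L9110 library on GitHub. Instead of jumping to the requested speed, the
// output creeps toward it by RAMP_STEP each update, limiting the current
// spikes caused by sudden full-speed reversals.
const uint8_t RAMP_STEP = 15;        // maximum speed change per update
int16_t currentSpeed = 0;            // signed: positive = forward, negative = backward

int16_t rampToward(int16_t target)
{
  if (currentSpeed < target)
  {
    currentSpeed = min(currentSpeed + RAMP_STEP, target);
  }
  else if (currentSpeed > target)
  {
    currentSpeed = max(currentSpeed - RAMP_STEP, target);
  }
  return currentSpeed;
}

// Example: instead of calling motors.backward(255) right after full-speed
// forward, something like this runs every loop iteration:
//   int16_t speed = rampToward(-255);
//   if (speed >= 0) motors.forward(speed); else motors.backward(-speed);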

While setting up the Raspberry Pi took longer than I’d hoped due to power supply problems, I’m glad I got a chance to learn the limits on my design and will keep bypass capacitors in mind when I design a permanent board.  I’m also excited for the possibilities having a Raspberry Pi on board provides and look forward to adding more advanced features to OpenADR!

OpenADR: Wall Following

After several different attempts at wall following algorithms and a lot of tweaking, I’ve finally settled on a basic algorithm that mostly works and allows the OpenADR navigation unit to roughly follow a wall.  The test run is below:

Test Run

 

Algorithm

I used a very basic subsumption architecture as the basis for my wall following algorithm.  This just means that the robot has a list of behaviors that it performs.  Each behavior is triggered by a condition, and the conditions are prioritized.  Using my own algorithm as an example, the triggers and behaviors of the robot are listed below:

  • A wall closer than 10cm in front triggers the robot to turn left until the wall is on the right.
  • If the front-right sensor is closer to the wall than the right sensor, the robot is angled towards the wall and so turns left.
  • If the robot is closer than the desired distance from the wall, it turns left slightly to move further away.
  • If the robot is further than the desired distance from the wall, it turns right slightly to move closer.
  • Otherwise it just travels straight.

The robot goes through each of these conditions sequentially; if a condition is met, the robot performs the triggered action and skips checking the rest of the conditions.  As the test run video shows, this method mostly works but certainly has room for improvement.

Source


#include <L9110.h>
#include <HCSR04.h>
#define HALF_SPEED 127
#define FULL_SPEED 255
#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4
#define TARGET_DISTANCE 6
#define TOLERANCE 2
// Mapping of All of the distance sensor angles
uint16_t AngleMap[] = {180, 135, 90, 45, 0};
// Array of distance sensor objects
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};
uint16_t Distances[NUM_DISTANCE_SENSORS];
uint16_t Distances_Previous[NUM_DISTANCE_SENSORS];
L9110 motors(9, 8, 3, 2);
void setup()
{
Serial.begin(9600);
// Initialize all distances to 0
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances[i] = 0;
Distances_Previous[i] = 0;
}
}
void loop()
{
updateSensors();
// If there's a wall ahead
if (Distances[DEG_90] < 10)
{
uint8_t minDir;
Serial.println("Case 1");
// Reverse slightly
motors.backward(FULL_SPEED);
delay(100);
// Turn left until the wall is on the right
do
{
updateSensors();
minDir = getClosestWall();
motors.turnLeft(FULL_SPEED);
delay(100);
}
while ((Distances[DEG_90] < 10) && (minDir != DEG_0));
}
// If the front right sensor is closer to the wall than the right sensor, the robot is angled toward the wall
else if ((Distances[DEG_45] <= Distances[DEG_0]) && (Distances[DEG_0] < (TARGET_DISTANCE + TOLERANCE)))
{
Serial.println("Case 2");
// Turn left to straighten out
motors.turnLeft(FULL_SPEED);
delay(100);
}
// If the robot is too close to the wall and isn't getting farther
else if ((checkWallTolerance(Distances[DEG_0]) == -1) && (Distances[DEG_0] <= Distances_Previous[DEG_0]))
{
Serial.println("Case 3");
motors.turnLeft(FULL_SPEED);
delay(100);
motors.forward(FULL_SPEED);
delay(100);
}
// If the robot is too far from the wall and isn't getting closer
else if ((checkWallTolerance(Distances[DEG_0]) == 1) && (Distances[DEG_0] >= Distances_Previous[DEG_0]))
{
Serial.println("Case 4");
motors.turnRight(FULL_SPEED);
delay(100);
motors.forward(FULL_SPEED);
delay(100);
}
// Otherwise keep going straight
else
{
motors.forward(FULL_SPEED);
delay(100);
}
}
// A function to retrieve the distance from all sensors
void updateSensors()
{
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances_Previous[i] = Distances[i];
Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
Serial.print(AngleMap[i]);
Serial.print(":");
Serial.println(Distances[i]);
delay(1);
}
}
// Retrieve the angle of the closest wall
uint8_t getClosestWall()
{
uint8_t tempMin = 255;
uint16_t tempDist = 500;
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
if (min(tempDist, Distances[i]) == Distances[i])
{
tempDist = Distances[i];
tempMin = i;
}
}
return tempMin;
}
// Check if the robot is within the desired distance from the wall
int8_t checkWallTolerance(uint16_t measurement)
{
if (measurement < (TARGET_DISTANCE - TOLERANCE))
{
return -1;
}
else if (measurement > (TARGET_DISTANCE + TOLERANCE))
{
return 1;
}
else
{
return 0;
}
}

Conclusion

There are still plenty of problems that I need to tackle for the robot to be able to successfully navigate my apartment.  I tested it in a very controlled environment and the algorithm I’m currently using isn’t robust enough to handle oddly shaped rooms or obstacles.  It also still tends to get stuck butting against the wall and completely ignores exterior corners.

Some of the obstacle and navigation problems will hopefully be remedied by adding bump sensors to the robot.  The sonar coverage of the area around the robot is sparser than I originally thought, and the 2cm blind spot around the robot is causing some problems.  The more advanced navigation will also be helped by improving my algorithm to build a map of the room in the robot’s memory, in addition to adding encoders for more precise position tracking.

OpenADR: Test Platform

The second round of the Hackaday Prize ends tomorrow, so I’ve spent the weekend hard at work on the OpenADR test platform.  I’ve been doing quite a bit of soldering and software testing to streamline development and to determine what next steps I need to take.  This blog post will outline the hardware changes and software testing that I’ve done for the test platform.

Test Hardware

While my second motion test used an Arduino Leonardo for control, I realized I needed more GPIO if I wanted to hook up all the necessary sensors and peripherals.  I ended up buying a knockoff Arduino Mega 2560 and have started using that.

2016-05-28 19.00.22

2016-05-28 19.00.07

I also bought a proto shield to make the peripheral connections pseudo-permanent.

2016-05-29 11.47.13

Hardwiring the L9110 motor module and the five HC-SR04 sensors allows for easy hookup to the test platform.

HC-SR04 Library

The embedded code below comprises the HC-SR04 Library for the ultrasonic distance sensors.  The trigger and echo pins are passed into the constructor and the getDistance() function triggers the ultrasonic pulse and then measures the time it takes to receive the echo pulse.  This measurement is then converted to centimeters or inches and the resulting value is returned.


#ifndef HCSR04_H
#define HCSR04_H
#include "Arduino.h"
#define CM 1
#define INCH 0
class HCSR04
{
public:
HCSR04(uint8_t triggerPin, uint8_t echoPin);
HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout);
uint32_t timing();
uint16_t getDistance(uint8_t units, uint8_t samples);
private:
uint8_t _triggerPin;
uint8_t _echoPin;
uint32_t _timeout;
};
#endif

HCSR04.h


#include "Arduino.h"
#include "HCSR04.h"
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = 24000;
}
HCSR04::HCSR04(uint8_t triggerPin, uint8_t echoPin, uint32_t timeout)
{
pinMode(triggerPin, OUTPUT);
pinMode(echoPin, INPUT);
_triggerPin = triggerPin;
_echoPin = echoPin;
_timeout = timeout;
}
uint32_t HCSR04::timing()
{
uint32_t duration;
digitalWrite(_triggerPin, LOW);
delayMicroseconds(2);
digitalWrite(_triggerPin, HIGH);
delayMicroseconds(10);
digitalWrite(_triggerPin, LOW);
duration = pulseIn(_echoPin, HIGH, _timeout);
if (duration == 0)
{
duration = _timeout;
}
return duration;
}
uint16_t HCSR04::getDistance(uint8_t units, uint8_t samples)
{
uint32_t duration = 0;
uint16_t distance;
for (uint8_t i = 0; i < samples; i++)
{
duration += timing();
}
duration /= samples;
if (units == CM)
{
distance = duration / 29 / 2 ;
}
else if (units == INCH)
{
distance = duration / 74 / 2;
}
return distance;
}

HCSR04.cpp

L9110 Library

This library controls the L9110 dual h-bridge module.  The four controlling pins are passed in.  These pins also need to be PWM compatible as the library uses the analogWrite() function to control the speed of the motors.  The control of the motors is broken out into forward(), backward(), turnLeft(), and turnRight() functions whose operation should be obvious.  These four simple motion types will work fine for now but I will need to add finer control once I get into more advanced motion types and control loops.


#ifndef L9110_H
#define L9110_H
#include "Arduino.h"
class L9110
{
public:
L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB);
void forward(uint8_t speed);
void backward(uint8_t speed);
void turnLeft(uint8_t speed);
void turnRight(uint8_t speed);
private:
uint8_t _A_IA;
uint8_t _A_IB;
uint8_t _B_IA;
uint8_t _B_IB;
void motorAForward(uint8_t speed);
void motorABackward(uint8_t speed);
void motorBForward(uint8_t speed);
void motorBBackward(uint8_t speed);
};
#endif

L9110.h


#include "Arduino.h"
#include "L9110.h"
L9110::L9110(uint8_t A_IA, uint8_t A_IB, uint8_t B_IA, uint8_t B_IB)
{
_A_IA = A_IA;
_A_IB = A_IB;
_B_IA = B_IA;
_B_IB = B_IB;
pinMode(_A_IA, OUTPUT);
pinMode(_A_IB, OUTPUT);
pinMode(_B_IA, OUTPUT);
pinMode(_B_IB, OUTPUT);
}
void L9110::forward(uint8_t speed)
{
motorAForward(speed);
motorBForward(speed);
}
void L9110::backward(uint8_t speed)
{
motorABackward(speed);
motorBBackward(speed);
}
void L9110::turnLeft(uint8_t speed)
{
motorABackward(speed);
motorBForward(speed);
}
void L9110::turnRight(uint8_t speed)
{
motorAForward(speed);
motorBBackward(speed);
}
void L9110::motorAForward(uint8_t speed)
{
digitalWrite(_A_IA, LOW);
analogWrite(_A_IB, speed);
}
void L9110::motorABackward(uint8_t speed)
{
digitalWrite(_A_IB, LOW);
analogWrite(_A_IA, speed);
}
void L9110::motorBForward(uint8_t speed)
{
digitalWrite(_B_IA, LOW);
analogWrite(_B_IB, speed);
}
void L9110::motorBBackward(uint8_t speed)
{
digitalWrite(_B_IB, LOW);
analogWrite(_B_IA, speed);
}

L9110.cpp

Test Code

Lastly is the test code used to check the functionality of the libraries.  It simply tests all the sensors, prints the output, and runs through the four types of motion to verify that everything is working.


#include <L9110.h>
#include <HCSR04.h>
#define HALF_SPEED 127
#define FULL_SPEED 255
#define NUM_DISTANCE_SENSORS 5
#define DEG_180 0
#define DEG_135 1
#define DEG_90 2
#define DEG_45 3
#define DEG_0 4
uint16_t AngleMap[] = {180, 135, 90, 45, 0};
HCSR04* DistanceSensors[] = {new HCSR04(23, 22), new HCSR04(29, 28), new HCSR04(35, 34), new HCSR04(41, 40), new HCSR04(47, 46)};
uint16_t Distances[NUM_DISTANCE_SENSORS];
L9110 motors(9, 8, 3, 2);
void setup()
{
Serial.begin(9600);
}
void loop()
{
updateSensors();
motors.forward(FULL_SPEED);
delay(1000);
motors.backward(FULL_SPEED);
delay(1000);
motors.turnLeft(FULL_SPEED);
delay(1000);
motors.turnRight(FULL_SPEED);
delay(1000);
motors.forward(0);
delay(6000);
}
void updateSensors()
{
for (uint8_t i = 0; i < NUM_DISTANCE_SENSORS; i++)
{
Distances[i] = DistanceSensors[i]->getDistance(CM, 5);
Serial.print(AngleMap[i]);
Serial.print(":");
Serial.println(Distances[i]);
}
}

basicTest.ino

As always, all the code I’m using is open source and available on the OpenADR GitHub repository.  I’ll be using these libraries and the electronics I assembled to start testing some motion algorithms!

Big Seven Segment Display

Recently at work my team was throwing around the idea of adding a big counter to the workspace for real-time feedback on things like time left in the development cycle, number of static analysis warnings, etc.  For that reason I decided to build a large seven segment display to provide that continuous visual feedback.  I designed the display completely from scratch to serve as a simple project that utilized a wide variety of skills.  Below I go through each step of the process and how each part works.

Mechanics

2015-07-24 17.30.41

For the actual segment part of the display I used transparent PLA from Hatchbox.  I didn’t have any trouble with how these turned out and am happy with the filament.  I had to do some experimentation with infill percentage in order to get the right amount of light passing through.  I believe in the end I settled on 20%.

2015-07-24 17.30.51

Here’s a close-up of the semi-transparent digits.  Two 5mm LEDs fit snugly into the holes in the back.

2015-07-25 22.18.20

 

The individual segments are designed to press fit into black PLA frames.  I “welded” the four separate frames together using a 3Doodler to extrude plastic into the gaps.  For aesthetics I only welded the backs of the frames together, so while they’ll stay connected the seams won’t hold up to much abuse.

Electronics

Segment_Schem    Segment_Board.png

To keep board size small and reduce complexity I designed individual boards for each segment.  They only consist of two 5mm through hole LEDs, a current limiting resistor, and a 0.1″ header to connect to the main board.

2016-04-03 17.59.20

Here’s a pic of the boards soldered up and hot glued into the display.

Controller_Schem

Controller_Board.png

The control board is just a breakout for the TPIC6B595, a high-current shift register that can drive LEDs directly and be daisy-chained using SPI.

SparkFun ESP8266 Thing - Dev Board

I’m using the Sparkfun ESP8266 Thing Dev Board as the brains of the project.  This nifty little WiFi controller is Arduino-compatible and allows me to communicate with the display without needing to connect it to a USB port.

Firmware

Like I mentioned above, I’m using the SPI bus to communicate with each individual LED controller, daisy-chaining them all together.  All that’s required is to shift out four separate 8-bit numbers to set the appropriate LEDs.

The server side of things is the most complicated part of the project.  I borrowed heavily from the Web Server example in Sparkfun’s hookup guide.  The Thing board hosts a web server which accepts GET requests with the number to set the display to.  It then parses the string version of the number and converts each digit to the correct 8-bit sequence for the seven segment display.
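As a simplified sketch of what that looks like, the snippet below maps each digit to a segment pattern and shifts the four bytes out over SPI.  The latch pin and the bit-to-segment ordering are assumptions here; the real mapping depends on how the segment boards are wired to the TPIC6B595 outputs.

// Simplified sketch of driving the four daisy-chained TPIC6B595s over SPI.
// The segment order (bit 0 = A ... bit 6 = G) and latch pin are assumptions.
#include <SPI.h>

const uint8_t LATCH_PIN = 15;  // hypothetical latch (RCK) pin on the Thing Dev board

// Segment patterns for digits 0-9, bits ordered A,B,C,D,E,F,G
const uint8_t DIGIT_PATTERNS[10] = {
  0b0111111, 0b0000110, 0b1011011, 0b1001111, 0b1100110,
  0b1101101, 0b1111101, 0b0000111, 0b1111111, 0b1101111
};

void showNumber(uint16_t value)
{
  digitalWrite(LATCH_PIN, LOW);
  // Shift the most significant digit out first so it lands in the last
  // shift register of the chain (assumed wiring).
  for (int8_t digit = 3; digit >= 0; digit--)
  {
    uint16_t divisor = 1;
    for (int8_t i = 0; i < digit; i++) divisor *= 10;
    SPI.transfer(DIGIT_PATTERNS[(value / divisor) % 10]);
  }
  digitalWrite(LATCH_PIN, HIGH);  // latch all four registers at once
}

void setup()
{
  pinMode(LATCH_PIN, OUTPUT);
  SPI.begin();
  showNumber(1234);
}

void loop() {}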

Software

The client side of things involved writing a Python script to submit the GET requests based on whatever data is being measured.  The current example goes through a directory and counts the occurrences of “if” in the code.

Source

As always, the source code and 3D files will be up on my GitHub.  I’m travelling this weekend, but will upload all the files Monday night.

Improvements

2016-04-08 12.43.14

While the LED display was plenty bright in my dim home office, I found out that it wasn’t nearly strong enough to overcome the bright fluorescent lights at my office.  It’s difficult to see in the picture above, but it’s hard to read the numbers from more than a few feet away.  When I originally designed the display I erred on the side of caution and ran the cheap eBay LEDs at ~3mA.  Unfortunately, this was not nearly enough.  While it would be relatively easy to swap out the resistors to increase the current to the LEDs, I’m also unhappy with how poorly the plastic segments diffuse the light and think just making the LEDs brighter would only exacerbate the problem.

For version two I think I’ll experiment with two different ways to fix the problems above.  One option would be to use “straw hat” LEDs instead of the generic 5mm kind.  These LEDs have improved light diffusion.  I could also use hemispherical holes in the plastic digits, rather than cylindrical ones, so that the light is directed in a wider pattern.

Another option would be using WS2812Bs.  In conjunction with an improved plastic digit to help with light diffusion, these LEDs would greatly simplify the electronics of the display.  These LED modules have built-in control logic, allowing them to be daisy-chained and controlled via a serial interface without any need for the shift register control board I used for version one.  WS2812Bs are also RGB LEDs, so another benefit is that the color would be controllable.
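As a sketch of how much simpler the version-two electronics could be, here’s what driving the digits might look like with the Adafruit_NeoPixel library, assuming one WS2812B per segment and reusing the same segment patterns; the data pin and LED counts are placeholders.

// Hypothetical version-two sketch using WS2812Bs: one LED per segment,
// 7 segments per digit, 4 digits. Pin and counts are placeholders.
#include <Adafruit_NeoPixel.h>

const uint8_t DATA_PIN = 4;
const uint8_t SEGMENTS_PER_DIGIT = 7;
const uint8_t NUM_DIGITS = 4;

Adafruit_NeoPixel strip(NUM_DIGITS * SEGMENTS_PER_DIGIT, DATA_PIN,
                        NEO_GRB + NEO_KHZ800);

// Light the segments of one digit according to a 7-bit pattern, in any color.
void setDigit(uint8_t position, uint8_t pattern, uint32_t color)
{
  for (uint8_t seg = 0; seg < SEGMENTS_PER_DIGIT; seg++)
  {
    uint16_t led = position * SEGMENTS_PER_DIGIT + seg;
    strip.setPixelColor(led, (pattern & (1 << seg)) ? color : 0);
  }
}

void setup()
{
  strip.begin();
  setDigit(0, 0b0000110, strip.Color(255, 0, 0));  // a red "1" in the first digit
  strip.show();
}

void loop() {}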

Hopefully I’ll get a chance to start on version two soon!

Overly Dramatic Compile Button

As a software engineer I’ve long been unimpressed with the triviality associated with compiling code.  Surely the building of a masterful creation involving hundreds of source files and complex algorithms deserves more than just a small keyboard shortcut.  There should be drama, there should be flair, there should be maniacal laughter!


To fill these requirements I’ve created the Build Button!  Using the wonderfully dramatic Adafruit Massive Arcade Button, a mini Arduino Kickstarter reward, and a 3D printed enclosure, I made a button that communicates serially with a computer through the USB port.  A Python program on the computer monitors the currently active window and tells the button to light up when the active program matches a list of programs with build shortcuts.  When the button tells the computer that it’s been pressed, the computer executes the keyboard shortcut associated with the currently active window to compile the code.
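The actual source is on my GitHub (linked below), but the firmware side boils down to something like the sketch here, assuming a simple single-character protocol where the computer sends ‘1’ or ‘0’ to control the button’s lamp and the button sends a character whenever it’s pressed.

// Rough sketch of the button firmware -- the real source is on GitHub.
// Assumed protocol: the host sends '1' or '0' to turn the arcade button's
// lamp on or off, and the button sends 'B' whenever it's pressed.
const uint8_t BUTTON_PIN = 2;
const uint8_t LAMP_PIN = 3;

bool lastPressed = false;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(LAMP_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Lamp control from the host
  if (Serial.available()) {
    char c = Serial.read();
    if (c == '1') digitalWrite(LAMP_PIN, HIGH);
    if (c == '0') digitalWrite(LAMP_PIN, LOW);
  }

  // Report presses to the host (falling edge only)
  bool pressed = (digitalRead(BUTTON_PIN) == LOW);
  if (pressed && !lastPressed) {
    Serial.println('B');
    delay(50);  // crude debounce
  }
  lastPressed = pressed;
}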

The source for this project is on my GitHub.