Jon's Place

Friday, July 18, 2014

NanoSeeker Version 2

So, it's been quite a while since I last posted here. I spent most of April and May traveling, including going to Maker Faire in San Mateo, and JSConf in Florida. While I was at JSConf, I ran into some of the guys from OpenROV, and we immediately started talking about ROVs and AUVs. I had brought NanoSeeker with me, just to show to anyone who was interested, and we discovered, completely by accident, that the battery holder tubes for an OpenROV are exactly the right size to make a transparent shell for NanoSeeker.

NanoSeeker with a clear shell
Eric Stackpole, one of the founders of OpenROV, gave me a spare one they had with them, and I immediately started thinking about what I could do with a new version of NanoSeeker. I built the first version of NanoSeeker in 2009, and a lot has changed in the home-brew electronics space since then.

I decided right off the bat that it should run MicroPython (and thus use an STM32F405 as the processor), and incorporate a full 9-axis IMU, as well as a depth sensor and a speed sensor. I started working on the CAD model, and got it to the point where it was ready to be printed. I bought my own material cartridge for the 3D printer I use for work, so over the past week I printed the parts I would need.

NanoSeeker v2, beside the old v1
I need to find another O-ring that is the right size, and of course I need to completely redesign the electronics, but I think this is going to work out nicely. I'm going to include a Bluetooth module on the board, which will allow me to update the Python scripts driving it while it is on the bench, without having to open it up.

I'm really looking forward to having a micro AUV that I can program in a reasonable language, without having to jump through a lot of crazy hoops.

Friday, February 14, 2014

MicroPython Board With a Crystal, and a New Motor Driver Board


So, I ordered a few of the crystals Damien used on the official MicroPython board, and although physically it isn't a perfect fit, it works, so I now have a nice clean board.


I also got rid of the orange LED, and replaced it with a proper blue LED (I didn't have any of those left when I put the board together). So, everything is as it should be, and is working nicely.

I also ordered a couple new motor driver boards, and have one hooked up now - it is working great. uCee will be back together again shortly, driving around with my custom MicroPython board (all the videos shown up until now were with the Teensy 3.1 running MicroPython).

New Motor Driver Board
Things are back to working, which is a good thing. I'll be bringing uCee (along with Roz) to PyCon in April in Montreal, since both robots are powered by Python.

Friday, February 7, 2014

Of H-Bridges and 3.3 Volts

So, in another twist of this ongoing saga, it turns out the h-bridge I was using for uCee is not designed to work properly with 3.3 volt logic. I had it hooked up, but the motors weren't turning. I took it all apart, hooked up my logic analyzer to the logic pins, and sure enough I was getting a valid signal. I took my multimeter and was going to test the battery voltage to make sure everything was okay, but when I touched the battery probe to the ground pin on my MicroPython board, the motor started running.

That was weird. So I reset it, and tried again. Same results. I tried it with the tip of a screwdriver, touching one of the +5 volt pins - same results.

I went and found the (now discontinued) product page on Pololu.com, and sure enough, the minimum required voltage is around 4.2 volts.

This is only the second robot I've built with 3.3 volt logic (NanoSeeker was the first), and the ARM chip I'm using (STM32F405) has 5-volt tolerant logic pins, so I haven't had to think about it too much so far. Clearly, I didn't think about it enough. It worked fine with the Teensy 3.1, but it doesn't with my new board.

Anyways, I found a cool Canadian company called Diigiit Robotics that had one of Pololu's new, much cheaper h-bridges in stock, so I ordered one. This h-bridge works with logic down to 2 volts, so I should be safe.

I'll have to print a mounting plate to hold this board in place - the board is much smaller than my current h-bridge, and has no mounting holes. I'll design a plate that is in the shape of the old h-bridge, with the middle cut out and some clips to hold the board in place.

Monday, February 3, 2014

MicroPython Board Working

Things weren't looking that great as of my last update, but I soldered in an 8 MHz ceramic resonator I had in one of my parts bags, and the USB stuff magically started working!

So, I have a fully functional MicroPython board now. Below is a picture showing my new board on the left, with my Teensy 3.1 carrier board on the right. The new board is a few millimeters longer, but the same width, and the connectors are laid out in a much more usable fashion. The four wires you see coming out of the connector go to a couple of push buttons, which let me reset the board and also put it into DFU programming mode. It's hard to get at the surface of the board while it's in the robot, so I didn't want to put push buttons on the board itself. Once the MicroPython software gets more stable and complete, I won't be re-flashing the board much, so I can remove the buttons.

Two Boards Running MicroPython

Sunday, February 2, 2014

MicroPython Boards, Crystal Trouble, Flashing LEDs

So, it's a long and sordid tale. I got my boards back on Friday:


They sure look pretty. I soldered up the first board, and found out that I had switched the polarity of my battery connector between the Teensy and this board, so I let out a bunch of magic smoke when I first plugged it in. Fortunately, it was just the 5 volt regulator that smoked, and not the ARM chip, so I replaced the regulator and fixed my battery plug. I hooked everything back up, and...

Nothing. No response over USB. I did a lot of troubleshooting, and determined that it was probably an oscillator issue.


That picture shows the BOOT0 line (third row), the RESET line (second row), and the oscillator (top row) hooked up to my logic analyzer. So you pull BOOT0 high, and then, while it's high, pull RESET low. At that point, the chip is supposed to start up the external oscillator, which clearly isn't happening.

My brother suggested I could use one of the serial interfaces to the bootloader, and he went so far as to put together a blinking-LED "Hello World" style application I could flash, along with the serial bootloader programming code from the Espruino project. I re-purposed my IMU port (which normally uses I2C, but the SCL/SDA lines double as Tx/Rx for USART3 on the chip). I removed the I2C pullup resistors, removed the crystal, grounded the USB D+ and D- lines, and plugged an FT232 USB-to-TTL converter into my Ubuntu laptop. I was able to program the chip successfully with that setup, so now I know the remaining issue is with the crystal, and not something else fundamentally wrong with the ARM chip or my board.


So, I'm not sure where to go from here. The math says I should be using 6 pF capacitors with the crystal, but something clearly isn't working.
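For what it's worth, the math I'm referring to is just the standard load-capacitor calculation. Here's the arithmetic as a tiny Python snippet - the 8 pF load capacitance and the ~5 pF of stray board capacitance are assumed values for illustration, not numbers pulled from the actual parts:
  # Standard crystal load-capacitor calculation: the two caps are in series
  # across the crystal, so C_L = C/2 + C_stray when both caps are equal,
  # which gives C = 2 * (C_L - C_stray). Values below are assumptions.
  C_load = 8.0    # pF - crystal's specified load capacitance (assumed)
  C_stray = 5.0   # pF - stray capacitance of traces and pins (assumed)
  C_each = 2 * (C_load - C_stray)
  print(C_each)   # -> 6.0 pF per capacitor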

I'm going to code up an 8 MHz signal generator using my Teensy 3.1 board, and see if I can get USB working with an external clock source like that. If that works, then I need to try and figure out why this crystal isn't working.

Tuesday, January 28, 2014

Custom MicroPython Board

One of the things I've been working on in my spare time is a new board for uCee. I was originally going to use the actual MicroPython board that I will get from the Kickstarter campaign, but I decided to build my own instead, for a couple of reasons:

  1. It will fit better, and be easier to connect everything to
  2. I want to start playing with more advanced micro-controllers, and building your own board is definitely a good way to make that happen
The other thing I decided, after my work-related adventures with the ammeter board, is that I wanted to use a PCB design tool that can generate the PCB, with the appropriate connections, from a schematic. Most hobbyists use Eagle for that, but I absolutely detest its user interface, so I decided to try a different package, namely KiCad. Its user interface definitely takes some getting used to, but once I figured out most of the idiosyncrasies, it is quite usable.

Damien George (the guy behind MicroPython) very nicely published the schematic for his latest revision of the MicroPython board, so I shamelessly copied what I needed and added my own parts that uCee needs, and I ended up with something that should work.

Here's my schematic:


And here's what the PCB looks like, first with the bottom layer hidden (which makes the top much more understandable), and then with the bottom layer visible:



KiCad can also show you a 3D rendering of what your board will look like, but you have to supply 3D models of any custom parts you include. My board has a whole pile of parts that I had to create my own footprints for (USB connector, uSD connector, all the Hirose DF-13 2, 3, 4, and 6-pin connectors, etc.). So I went to the manufacturers' websites and found CAD files for most of them, imported them into Rhino, scaled and positioned them correctly, and then exported them as STL files. Wings 3D, although nowhere near as capable as Rhino, has a WRL exporter, which is the format KiCad expects, so I imported the STL models into Wings 3D, colored them, and exported them as WRL files. I then applied the 3D model to each component footprint, which allows me to generate this:


All in all, not bad.

I plan on sending in this board to get made later this week, so hopefully I'll have my own MicroPython board inside uCee within a couple weeks.

Saturday, January 25, 2014

uCee With A Lid

So, the plan all along has been to print a lid for uCee; I just hadn't gotten around to it until today.

Way back when I first started this project, the very first thing I did was create a CAD model of the robot:

uCee - CAD Model

Now that the robot finally has its top, you should be able to see the resemblance:

uCee - Real Robot

So, I think it has turned out pretty well, all in all. I just ordered a new pair of batteries for it, so by next weekend it will hopefully work a little better (the batteries I have now are ones I bought back in 2006 for NanoSeeker, and they are pretty much toast). I also have to put together 3 ProxDot sensors, and mount them.

For now, I'm going to work on the software, which can be found (in its current simplistic form) in my GitHub repository.

Here's another video, this time with the top on:


uCee - Machining Axles

So, this morning I cranked up my Sherline lathe, and machined some brass axles for the four idler wheels (the ones that aren't attached to a motor).

Idler axles machined from 3/8" brass rod
To give a sense of scale, the smaller part of the axle is 3mm in diameter and about 13mm long. The larger part is around 9mm in diameter, and 2.5mm long.

Here's a neat picture I took while I was putting it all back together again:


You can clearly see the white encoder disks on the two motors, as well as the rather tight space between the h-bridge board (on the bottom) and the Teensy carrier board above it. I don't have the ProxDot sensors assembled or installed yet - they plug into the small right-angle plugs on the bottom of the carrier board.

Friday, January 24, 2014

MicroPython

So uCee is now running MicroPython on the Teensy 3.1, and moving around, avoiding obstacles (in a very simplistic manner).


It's pretty cool, and I can already do a lot of the stuff I need in order to run this robot. We need to add interrupt-on-change support for input pins, so I can handle the encoders, and some I2C support so I can talk to the IMU. The rest is just coding...
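Once that lands, the encoder handling I have in mind would be something like this minimal MicroPython sketch (the pin names and the machine.Pin.irq() API here are assumptions - the Teensy port may end up exposing interrupts differently):
  # Count encoder ticks with a pin-change interrupt. Pin names and the
  # machine.Pin.irq() API are assumptions for this sketch.
  from machine import Pin

  left_ticks = 0

  def on_left_edge(pin):
      # Rising edge on channel A: channel B tells us the direction.
      global left_ticks
      if left_b.value():
          left_ticks += 1
      else:
          left_ticks -= 1

  left_a = Pin("D2", Pin.IN, Pin.PULL_UP)
  left_b = Pin("D3", Pin.IN, Pin.PULL_UP)
  left_a.irq(trigger=Pin.IRQ_RISING, handler=on_left_edge)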

You can see my current code in my GitHub repository.

Thursday, January 23, 2014

uCee Coming Together

So the Teensy 3.1 carrier board I ordered from OSH Park finally showed up (it took 5 weeks to get here from when I ordered it, but I can't complain - the price was right).

I still have the idler wheel axles to machine (I'm going to machine them from brass on my Sherline lathe), and I still have to put together my ProxDot sensors, but everything else is put together now, so I'll hopefully only have to strip the robot back down one more time before it is ready to roll.

uCee - Getting Closer
 The h-bridge is mounted and wired up under there, although you can't see it. Here's a picture of the Teensy carrier board (populated), with a blank one beside it (OSH Park gives you three boards):


Here are some more pictures of the robot:




I also have to order the batteries, although for now I can run it off a wall supply. My brother has MicroPython (in its current form) ported to run on the Teensy 3.1, and he's got a cute little robot running off it, so I'll have this guy running Python pretty soon.

Thursday, January 2, 2014

uCee Motoring

Perhaps "motoring" is a bit of a stretch, because it certainly isn't moving yet, but the motor code is running and working.

Here's a quick video showing a simple test of the PWM speed control of one of the motors:


As I move my hand closer to the IR range finder sensor, it slows the PWM down. Very simple, but an easy way to test that speed control is working.
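The actual test code is in the repository; purely as a sketch of the idea, it boils down to something like this (written here against the generic MicroPython machine API, with placeholder pin names and scaling - the real code runs on the Teensy):
  # Scale the motor's PWM duty with the Sharp IR reading. Pin names, the PWM
  # frequency, and the scaling are placeholders.
  import time
  from machine import ADC, PWM, Pin

  ir = ADC(Pin("A0"))                # Sharp IR range finder output
  motor = PWM(Pin("D9"), freq=20000)

  while True:
      raw = ir.read_u16()            # 0..65535; a closer hand reads higher
      motor.duty_u16(65535 - raw)    # so slow the motor down as it closes in
      time.sleep_ms(20)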

The code for this is in my GitHub repository. Right now the current readings I'm getting from the h-bridge board are all over the place when it's running at full speed. I don't expect to ever run this robot at full speed, and the current readings settle down as the motor slows down, so hopefully that will make them useful. The encoders are rock-solid, using the interrupt pin capabilities of the Teensy 3.1.

I'm going to use this PID library to implement closed loop speed control with the encoders. There's a really good multi-part writeup discussing how it all works - highly recommended. I'm going to translate the encoder ticks into mm/second, and use that as the Input for the PID library. I will specify the requested speed (Setpoint) in mm/second also, and if my conversion routine is correct, the robot's speed should be fairly precise.
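That library is Arduino C++, so this isn't it - but just to sketch the plan, the tick-to-mm/second conversion and PID step would look roughly like the Python below, with a placeholder wheel diameter and gains (the 500 counts per revolution figure is the encoder spec):
  # Rough sketch of the planned closed-loop speed control (not the PID
  # library itself). Wheel diameter and gains are placeholders.
  import math

  TICKS_PER_REV = 500                            # encoder counts per wheel rev
  WHEEL_DIAMETER_MM = 30.0                       # placeholder
  MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV
  KP, KI, KD = 0.8, 0.2, 0.0                     # placeholder gains

  class SpeedPid:
      def __init__(self, setpoint_mm_s):
          self.setpoint = setpoint_mm_s          # requested speed, mm/second
          self.integral = 0.0
          self.last_error = 0.0

      def update(self, delta_ticks, dt_s):
          # Convert ticks counted over the last dt_s seconds into mm/second,
          # run one PID step, and return a clamped PWM duty (0..100%).
          speed_mm_s = delta_ticks * MM_PER_TICK / dt_s
          error = self.setpoint - speed_mm_s
          self.integral += error * dt_s
          derivative = (error - self.last_error) / dt_s
          self.last_error = error
          out = KP * error + KI * self.integral + KD * derivative
          return max(0.0, min(100.0, out))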

Sunday, December 29, 2013

uCee - Of Rangefinders and ProxDots

So, today I've been working on uCee (note the slight spelling difference from my previous post - I like uCee as a name better than uC).

Specifically, I've been working on getting my range finders working in a reasonable fashion. I'm using three GP2Y0A41SK0F Sharp IR analog range-finders. I'm powering them with 5 volts, but since the maximum voltage they return is about 3.1 volts, I can hook them directly to the analog inputs on my Teensy 3.1, which allows a max of 3.3 volts on analog inputs.

However, as anyone who has used them knows, Sharp IR range-finders have a curious behavior once the object being sensed gets close enough - the signal inverts, and starts acting as if the object is moving away instead of getting closer. For the sensors above, that happens at 30 mm away. I was going to use a ProxDot sensor in conjunction with the range sensor, but they trigger when the object is about 45-50 mm away. However, since ProxDots have 2 IR LEDs, I tried covering one with a tiny piece of black electrical tape, so there would be only half the strength of light for it to sense. Sure enough, with one LED covered, the ProxDot triggers at almost exactly 30 mm, which is perfect.

The code (without the ProxDot) is as follows:
  int value = analogRead(sharpPin);
  float voltage = value * 0.0032258; // 0-1023 -> 0-3.3 volts
  float exactDistance = (100 * ((1.25 / voltage) - 0.15));
  int distance = (int)exactDistance;
  if (distance > 200)
    distance = -1;
So now I'm going to combine in the ProxDot sensor, and add a few lines of code:
  int value = analogRead(sharpPin);
  float voltage = value * 0.0032258; // 0-1023 -> 0-3.3 volts
  float exactDistance = (100 * ((1.25 / voltage) - 0.15));
  int distance = (int)exactDistance;
  if (distance > 200) {
    distance = -1;
  } else {
    int proxDot = digitalRead(proxDotPin); // signal is low if something is sensed, high otherwise
    if (!proxDot)
      distance = max(0, 55 - distance);
  }
And presto - I get a much more useful distance value at close ranges.

Friday, December 27, 2013

Roz - Walking Again

So Roz is now walking again, but this time with a Beaglebone Black running Python. I'll blog more of the details later, but for now here's a quick video:


Wednesday, December 18, 2013

uC (MicroCrawler)

So I've decided to call this robot uC, short for MicroCrawler. It is also a micro-controller based robot, so the name fits doubly well.

I got a 9-axis IMU and a BlueSMiRF module in the mail today, and they fit nicely and look great. Here's a shot showing them mounted, with the proper gearmotors mounted, and the non-drive wheels temporarily attached. I'm pretty happy with how it's turning out.

uC Is a Small Robot...
My Teensy carrier board is being made at OSH Park, and I should get that early in the new year. Here's what it looks like:

Teensy 3.1 Carrier Board
It has a lot of 0.050" spacing plugs, and uses all of the regular I/O pins on the Teensy.

I'm not going to be able to do much more to the robot itself until the carrier board arrives - I don't want to be soldering and then de-soldering wires from my sensors and h-bridge board, so I'll wait until I get the board to hook everything up. In the meantime, I'll be working on the software that will run on the Teensy, and possibly also on the monitoring software that will run on my phone.

A few more random pictures:




Sunday, December 15, 2013

Yet Another Robot

So, while I was flying back home from RobotsConf (which was awesome, btw), I was paging through the copy of Make Magazine that they gave us. On page 64, I saw this cool little robot called CamBot, shown below.

Cambot by Dave Astolfo
One of the issues I ran into with Roz while I worked on getting him walking was the constant need to reboot the Beaglebone Black, because of USB issues, networking issues, or power issues. I decided I wanted a robot to play with that went back to using a micro-controller, but I wanted to use something more powerful than your typical AVR. I supported the Micro Python Kickstarter, but in the meantime I'm going to use a Teensy 3.1 to control this thing.

So, I fired up my CAD package, and came up with the model below. It uses treads from a Lego Technic set, but everything else is 3D printed.






Now, a week later, I have this:


The gear motors are placeholders for the real ones, which I've ordered, along with the optical encoders they support. You can see in the image below that they mount nicely using 1.6mm machine screws. This picture also gives you an idea of the scale - the robot will be 102mm long, 91mm wide, and 50mm high.


Shown below are some of the parts I'm using, including a dual-motor h-bridge, and a bunch of tiny connectors.


The robot will have 3 Sharp analog IR sensors, one on each side and one in front. In addition to the Teensy 3.1 processor mentioned above, it will have a 9-axis IMU and a Bluetooth module. The Bluetooth module will allow me to write a simple app on my Android phone (Nexus 5) to control and get feedback from the robot. By "control", I really mean stuff like choosing a mission to run, since this robot (like all my others) will be fully autonomous. The robot also has optical encoders on each motor, with 500 counts per revolution of the drive wheel. That, along with full PWM control of the motors, will allow me to use a proper PID solution for control.

The motor driver board also provides analog feedback on the current draw of the motors, so I'll be able to detect if/when the motors are stalled.
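As a rough sketch of what that enables (the current threshold and the idea of requiring several consecutive high readings are my own assumptions):
  # Stall check based on the h-bridge's analog current feedback. The
  # threshold and the consecutive-sample debounce are assumptions.
  STALL_CURRENT_MA = 900     # placeholder: well above normal running draw
  STALL_SAMPLES = 10         # require this many high readings in a row

  def is_stalled(current_samples_ma):
      recent = current_samples_ma[-STALL_SAMPLES:]
      return (len(recent) == STALL_SAMPLES and
              all(ma > STALL_CURRENT_MA for ma in recent))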

The robot will use a pair of tiny 250 mAh lipo batteries in series.

At some point in the future I want to add a very small camera to this robot, and start playing with some very simple visual processing, like blob tracking and fiducial recognition.

Speaking of Roz, I managed to get him walking quite nicely at the conference using the NUKE engine running in Python on the Beaglebone Black. I didn't get any video, but I'll get some this week and post it here.

Friday, November 29, 2013

Roz - Ready To Go

Roz from the bottom
From a hardware perspective, Roz is now complete (for this revision, anyways). I soldered together a new auxiliary power board (since the old one didn't work), and made a new board I call the power distribution board, shown in the picture to the left. It has four terminal blocks, and a 5 volt, 10 amp voltage regulator. Ten amps is a little overkill to power a Beaglebone Black and a USB hub, but it was cheap, so I went for it.

Also visible in the picture is the 3-cell 11.1 volt Lithium Polymer battery (2000 mAh). Power goes from the battery, through the on/off main switch, and then up to the aux. power board. It then comes back down to the distribution board, where it is shunted (still at 12 volts) back up to the bioloid bus, and run through the 5 volt regulator for the BBB and the USB hub.

Next up, I need to write some Python code, both on the desktop and on the BBB. The desktop code will mainly be utilities that let me talk to the bioloid devices, set parameters, and so on. On the BBB, it will mean porting an implementation of NUKE (a nice inverse kinematics engine) and setting up a really simple finite state machine - something like the sketch below - to make it walk around and not run into things.
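Roughly this shape, in other words - the state names, distances, and hysteresis are placeholders, and the actual gaits would come from NUKE:
  # Walk-and-avoid state machine sketch. State names, distances, and the
  # hysteresis are placeholders; the gaits themselves come from NUKE.
  OBSTACLE_MM = 150   # start turning when the center sensor reads below this
  CLEAR_MM = 250      # go back to walking once it reads above this

  def next_state(state, left_mm, center_mm, right_mm):
      if state == "walk_forward":
          if center_mm < OBSTACLE_MM:
              # Turn toward whichever side currently has more room.
              return "turn_left" if left_mm > right_mm else "turn_right"
          return "walk_forward"
      # Currently turning: keep turning the same way until the path is clear.
      return "walk_forward" if center_mm > CLEAR_MM else state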

Sunday, November 24, 2013

Roz & RobotsConf

So I've been getting Roz ready to go to RobotsConf, a cool conference in Florida in two weeks. Mozilla is one of the sponsors for the conference, so I managed to snag a free ticket.

Roz all wired up
In the picture to the left, Roz has the Beaglebone Black mounted on the back at the top. Under it, you can see (on the right side of the robot) a bioloid bus board on the left, and a bioloid power board on the right. The bus board allows a computer (like the BBB) to talk directly to the bioloid bus (and thus the AX-12 servos) over a high speed serial bus.

The power board does automatic switching between battery power and wall power. This allows you to hot-plug wall power so you don't kill the battery while you're debugging on the bench. This is particularly useful when you have a Linux single board computer, since you don't want to be constantly powering it up and down. Note that this board does not charge the battery - it simply chooses to provide power from a wall source if one is present, or from the battery if not. The battery is a 3-cell Lithium Polymer battery pack, with 2000 mAh of capacity.

This picture shows the other side, so you can see the 4-port USB hub on the left, and a new board I just made, which I call the bioloid mini-io. It is a bus device (like the servos), and it has 6 analog inputs (at 5 volts), and 8 digital I/O pins. I use this to feed the analog signals from the three Sharp IR sensors mounted on the head.

The head has the aforementioned analog Sharp IR range sensors, as well as a cheap two-camera webcam. This will hopefully allow me to do very simple stereo vision (at a very low frame rate) using some of the stuff in OpenCV.
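Just to sketch the sort of thing I mean (the camera indices and block-matcher parameters below are guesses, and a real setup would need calibration and rectification first):
  # Grab one frame from each camera and compute a block-matching disparity
  # map with OpenCV. Camera indices and parameters are guesses; a real setup
  # needs calibration and rectification first.
  import cv2

  left_cam = cv2.VideoCapture(0)
  right_cam = cv2.VideoCapture(1)
  ok_l, left = left_cam.read()
  ok_r, right = right_cam.read()
  if ok_l and ok_r:
      gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
      gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
      matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
      disparity = matcher.compute(gray_l, gray_r)
      vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
      cv2.imwrite("disparity.png", vis.astype("uint8"))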

At the back I plan to add a tail, which will help keep the robot balanced while walking. I also have a vertical pole mount, and I plan to mount my Razor 9-axis IMU up there. Since it is 3.3 volts and uses a TTL serial interface, I can hook it directly to one of the unused UARTs on the BBB. The 9-axis IMU will be used mainly as a compass, although I might eventually use it to help smooth the walking gait.
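Reading it from Python on the BBB should just be pyserial - something like the sketch below, where the UART device path, baud rate, and the "#YPR=yaw,pitch,roll" line format are assumptions (that's what the stock Razor AHRS firmware prints, but it depends on what's loaded):
  # Read a heading from the Razor IMU over a BBB UART with pyserial. The
  # device path, baud rate, and "#YPR=" line format are assumptions.
  import serial

  port = serial.Serial("/dev/ttyO1", 57600, timeout=1)

  def read_heading():
      # Returns the yaw angle in degrees, or None if no valid line arrived.
      line = port.readline().decode("ascii", "ignore").strip()
      if not line.startswith("#YPR="):
          return None
      try:
          yaw, pitch, roll = (float(v) for v in line[5:].split(","))
      except ValueError:
          return None
      return yaw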

Underneath, in front of the battery, is another new board I made, which I call the power distribution board. It takes 12 volts from the power board mentioned above, and provides a 12 volt output to the bus board. It also has a 10 amp 5 volt regulator, which provides 5 volts for both the USB hub and the BBB. Ten amps is probably overkill, but the regulator is cheap, so it doesn't matter to me.

The main power switch is on the right side (first picture), at the bottom of the middle body segment, and it switches the battery directly, and controls power to the entire robot. The bus board also has its own power switch, so I can shut off power to the servos independently.

Friday, September 27, 2013

So up until a month ago, I was working on performance-related aspects of Firefox OS (I'm an employee of Mozilla). The performance team had a meetup in Toronto at the end of August, and one of the team members thought it would be cool to use this digital USB ammeter to measure current draw from our phones.

3D CAD Model
I said "I know how to do that stuff", and in short order whipped up a 3D model of a battery harness that would allow us to insert a shunt into the + line between the battery and the phone, and feed that shunt to the ammeter. I printed a couple versions of the harness on the in-office Replicator 2, and wasn't very happy with the print quality. I sold my Dimension uPrint earlier this year, so I ended up going to a commercial printing place and had it printed on a uPrint. It came out beautifully, of course, so I designed a little printed circuit board for it, and had that manufactured.

Python Ammeter Script
I wrote a simple Python script to sample the data from the ammeter, and within a week I had a pretty neat setup...
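The script itself is nothing fancy; since the ammeter's protocol isn't described here, the snippet below is only an illustration of the shape of it - the device path, baud rate, and line format are all assumptions:
  # Sample current readings from a serial device and log them with
  # timestamps. Device path, baud rate, and line format are assumptions.
  import serial
  import time

  port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

  with open("current_log.csv", "w") as log:
      log.write("time_s,current_mA\n")
      start = time.time()
      while True:
          line = port.readline().decode("ascii", "ignore").strip()
          if not line:
              continue
          try:
              current_ma = float(line)
          except ValueError:
              continue
          log.write("%.3f,%.3f\n" % (time.time() - start, current_ma))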









Battery Harness
I demoed this setup at the work-week in Oslo at the beginning of September, and it seemed to be well-received.

After I got back from Oslo, I started work in earnest on this. We decided that the commercial ammeter didn't have the features we needed (specifically the ability to remotely disconnect and reconnect the battery line to force a phone restart), and although the company makes custom parts, having a single prototype to test with before ordering 200-300 boards wouldn't work for us. So, I sat down and started reading data sheets and ordering parts. My brother Dave Hylands (who also works for Mozilla on Firefox OS) has been an enormous help in all this.

Ammeter Schematic
As of today, I have prototyped (on a breadboard) all the parts I need to make this happen. We (Mozilla) ordered a commercial 3D printer, and once it gets here I'll be able to start cranking out models. I've designed the ammeter PCB, and have ordered the first iteration of it. I'm hoping to have enough time to populate the board before I leave for the Mozilla Summit next week, so I can bring it with me to show people (I'll be in Santa Clara).

Wednesday, March 27, 2013

I'm Back

Hi everyone,

It's been a long time - once the funding ran out for the work on BrainBot, I got a new job working with a friend on cell phone repair automation software. That kept me insanely busy for a couple of years. In January of 2013, I resigned from that company to take a position with Mozilla on their Boot to Gecko team, working on Firefox OS, their new mobile phone operating system.

Along with this new job come regular hours, so I actually have some time to start working on robotics again. I've decided that I want to keep things simple, so I'm going to work on Roz, my Bioloid-based quad walker, which has been gathering dust on a shelf for 3+ years.

Here's a video of the last thing I did with Roz (back in December 2009), which mainly involved tweaking the gait engine to make it go really fast:


So it looks like I'm going to pick up a Raspberry Pi board to control this thing. Unfortunately, I sold my 3D printer, so any physical modifications will be done with my CNC mill or other tools. I'll post more once I have something more to talk about. I don't expect to do much in terms of hardware until I get back from the next Mozilla work-week, which is at the end of April in Madrid, Spain. In the meantime, I'm doing some design work on my latest autonomous controller/brain software, which will run on the Pi (in Squeak Smalltalk, of course).

Monday, May 10, 2010

Brainbot is finally Autonomous



So, it's been a while since I updated here. Things have been busy, but I ran into some mechanical issues, and then had a really hard time finding a compass sensor that would work on this platform, given the off-road (and thus uneven ground) issues it runs into.

I finally solved the compass issue by buying the new SparkFun Razor 9 DOF IMU sensor and loading the provided AHRS software onto it, but even that didn't work exactly right. The heading it reported was very non-linear over the full 360 degrees. I fixed it in a very brute-force way, by building a turntable I could place Brainbot onto, and then coding a lookup table to convert IMU heading into real heading.
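The lookup table is nothing clever - roughly the shape below, with linear interpolation between the turntable measurements (the calibration numbers here are made up for illustration, not the ones from my graph):
  # Brute-force heading correction: (IMU heading, true heading) pairs measured
  # on the turntable, with linear interpolation between them. The calibration
  # values here are made up.
  CAL = [(0, 0), (45, 52), (90, 97), (135, 141), (180, 179),
         (225, 221), (270, 265), (315, 309), (360, 360)]

  def corrected_heading(imu_deg):
      imu_deg %= 360
      for (x0, y0), (x1, y1) in zip(CAL, CAL[1:]):
          if x0 <= imu_deg <= x1:
              frac = (imu_deg - x0) / float(x1 - x0)
              return (y0 + frac * (y1 - y0)) % 360
      return imu_deg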

To the right is a graph showing the actual heading versus the IMU heading.

Here's a video showing the first autonomous run:


This is a video showing the second autonomous run, which is the same mission as the first one, but I start the robot on the "wrong" heading, and it auto-corrects (at the beginning):


This is a screenshot of the mission editor, with the mission path (from top left to bottom right), and the logged vehicle track overlaid on top of that. All the navigation right now is done using dead reckoning, with wheel encoders and the 9-axis IMU.

Now that the heading issue is (hopefully) solved, things should progress at a much more rapid pace. Next up is integrating obstacle avoidance, using the Hokuyo laser scanner, followed by visual servoing using the camera.