Another crappy SLAM development platform

Or, as I call it, the “Floating Navigation Sensor Assembly,” which is really just a gimbal.

Navigation unit repo

WiFi buggy repo


I’ll preface this by saying that my past experience with the TFmini-S on Arduino had me expecting it could only measure as close as 30cm. But the Python library I ended up using is so good I was able to get readings as close as 1cm, wow! I point this out because the purple sensor (VL53L0X) was intended for close range (within 30cm). The main difference between the two sensors is the FOV of the emitted conical beam: 25 degrees on the VL53L0X vs. 3.6 degrees on the TFmini-S (which is more accurate farther away).
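For reference, the TFmini-S streams 9-byte frames over UART: two 0x59 header bytes, distance and signal strength as little-endian 16-bit values, then a checksum over the first eight bytes. Here’s a rough sketch of parsing one frame in Python — the frame layout is my reading of the datasheet, so treat it as an assumption:

```python
def parse_tfmini_frame(frame: bytes):
    """Parse one 9-byte TFmini-S UART frame.

    Returns (distance_cm, signal_strength), or None if the header
    or checksum doesn't match.
    """
    if len(frame) != 9 or frame[0] != 0x59 or frame[1] != 0x59:
        return None
    if sum(frame[:8]) & 0xFF != frame[8]:
        return None  # corrupted frame, drop it
    distance_cm = frame[2] | (frame[3] << 8)   # little-endian distance
    strength = frame[4] | (frame[5] << 8)      # little-endian strength
    return distance_cm, strength
```

On the Pi you’d read the bytes from the serial port (e.g. with pyserial) and resync on the 0x59 0x59 header before handing a frame to this.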

Anyway, this platform is another learning project for me. At the heart of the pan-tilt assembly is an IMU (MPU-6050); its accuracy is not great but good enough for this junk thing. The whole unit pans/tilts around the center of the IMU, so the IMU is used for positioning. That part is so-so: for tilting I should have a good experience using NED to determine levelness with the horizon. The panning is different — I have no feedback and no bumpers. What I did on the quad robot was accumulate the gyro measurements, but that usually has errors in the range of 1–2 degrees or more per side, say 5 degrees off overall, which is not great. I have a thought to use a red line/marker on the chin of the robot to center the pan/tilt platform with OpenCV. I will see how that goes.
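The pan-angle accumulation described above is just dead-reckoning on the gyro: integrate the z-axis rate over time. That’s also why a small bias error grows into those few degrees of drift per sweep. A minimal sketch (the sample rates and bias values here are made up):

```python
def integrate_yaw(rates_dps, dt, bias_dps=0.0):
    """Accumulate gyro z-axis readings (deg/s) into a yaw angle.

    dt is the sample period in seconds; bias_dps is a constant bias
    estimate subtracted from every reading. Any residual bias
    integrates into drift -- the 1-2 degrees per side mentioned above.
    """
    yaw = 0.0
    for rate in rates_dps:
        yaw += (rate - bias_dps) * dt
    return yaw
```

Estimating bias_dps while the platform is known to be still (e.g. averaging a second of readings at startup) is the usual cheap way to knock the drift down.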

My approach with computer vision right now is basic blob/centroid finding; the distance sensors are then used to figure out how far away the blobs are, and bounding boxes get drawn around those items/cubes. I’m not saying this is a great approach or anything, but it’s what I’m doing right now “from scratch” — as in, finding pieces of code that do this.
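The blob/centroid step boils down to averaging the pixel coordinates of a thresholded mask. In practice the mask and contours would come from OpenCV; this sketch does only the centroid/bounding-box math in plain NumPy so the idea stands on its own:

```python
import numpy as np

def blob_centroid_and_bbox(mask):
    """Centroid and bounding box of the nonzero pixels of a binary mask.

    Returns ((cy, cx), (y0, x0, y1, x1)), or None for an empty mask.
    A distance-sensor reading aimed at the centroid's bearing then
    gives the range used to place the box in the world.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    centroid = (float(ys.mean()), float(xs.mean()))
    bbox = (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max()))
    return centroid, bbox
```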

The navigation unit is separate from the electronics of the robot that moves (a basic two-wheel servo platform). The navigation unit does the thinking, then just tells the robot to go in some direction via a websocket API. What’s nice about this, as opposed to my quad robot, is that it’s a stable platform, so velocity/acceleration estimates are more accurate — there’s no subtracting out the motion from the robot walking/shaking.

Bridge between the two parts

As mentioned, the two halves (navigation and robot) communicate by websocket, so the robot is like an RC car that the navigation unit steers.

The tail-dragger platform is nice due to its simplicity of construction and smooth motion. My environment is a pigsty of an apartment with crap everywhere, so it’s good for non-uniform obstacle detection. Also, I’m bad at algorithms/performant code, so anything I make is probably trash, but it’ll be fun.

I was working on trying to nail down the exact motion. You have to factor in the wheel’s rotation rate, how long it rotates, and the wheel’s diameter (circumference). Then you turn that into a function so you can send a string command like _mfs_10_mfe_, which translates to start/end markers for a move-forward command of 10 units. The units are inches from the command’s perspective, but they get translated into degrees rotated by the servo. I have something working, but it’s not 100% with regard to accuracy/predictability. The reason for the start/end markers is that I’m using a jank websocket keep-alive deal and it’s bunching the serial data up.
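Putting that together, the inches-to-rotation conversion and the marker framing can be sketched like this. The wheel diameter is a made-up placeholder; the _mfs_/_mfe_ markers are the ones from the command above. The regex parser at the end is what lets the receiver split commands back apart when the keep-alive bunches several together:

```python
import math
import re

WHEEL_DIAMETER_IN = 2.5  # placeholder: substitute the real wheel diameter
CIRCUMFERENCE_IN = math.pi * WHEEL_DIAMETER_IN

def inches_to_degrees(distance_in):
    """Total wheel rotation (degrees) needed to travel distance_in inches."""
    return distance_in / CIRCUMFERENCE_IN * 360.0

def frame_forward(units):
    """Wrap a move-forward amount in the start/end markers."""
    return f"_mfs_{units}_mfe_"

_CMD = re.compile(r"_mfs_(\d+)_mfe_")

def parse_commands(buffer):
    """Split bunched-up serial data back into individual move amounts."""
    return [int(units) for units in _CMD.findall(buffer)]
```

For example, parse_commands("_mfs_10_mfe__mfs_5_mfe_") recovers [10, 5] even though the two commands arrived glued together.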


So far the main problem I’m facing is interfacing the MPU-6050 with the Raspberry Pi over I2C — it’s not reliable. So I’ve had to resort to a cringe approach: using SPI to bridge a Seeeduino to the Pi. The Seeeduino is a microcontroller connected to the IMU; it samples the IMU, and the Pi can call down into the Seeeduino to pull out the data. Having not worked with SPI before, this is not easy for me, so I’m still working on it at this time.


Since trying to learn SPI would be a time sink, and not one I really want, I ended up squeezing in an MPU9250 instead, which, while still problematic, at least runs longer with fewer IO errors once it gets going. That breakout board is larger, so I had to cut out a piece of plastic to get it to fit. Its center point isn’t exactly aligned with the pan/tilt axes, unfortunately, but it’s pretty close.


If you’re interested, these are the most recent construction videos for the parts mentioned above.



