Twerk lidar robot — design and construction (part 1)

Jacob David C. Cunningham
Jan 20, 2022 · 10 min read


what have you brought upon this cursed land

In retrospect: I think this name may have shadow-banned this project in certain places, ha. It's okay if this thing is buried; what I get out of it is cumulative learning. I don't know how else to draw attention to "quadruped-mounted ToF sensor moves in a shaky pattern to map a portion of space in front of it and uses an IMU to build a map". There is also probably not much interest in building these things, because they are a significant commitment of work to put together even when following a guide, and they fill a very specific niche: "cool, it moves forward and back"... now what?

Part 2 | Repo

Part 1 is primarily about the physical design and intent of this robot. The next part is on the rest of the "cooler" software, specifically the actual terrain mapping and the real-time 3D telemetry web display.

It took me about two weeks to design and build the physical part of this thing. The software side is still ongoing. By the time I post this, I will have made the robot move and will have the "twerk lidar" obstacle-detection motion down. There's still a lot of work left for the full project, but you know, gotta get that ego satisfaction and post about it.

I recently left my day job (doing something else), so I have a lot of time now. I'm pretty happy, but I'm also super broke, living on debt, ha. Not great. I've been down that road before and it is not a good place to be.

Anyway, I've been pouring time into this project; by that I mean entire days back to back, like a three-days-in-a-row session for the circuit aspect alone.

It's a learning project and a "cool personification" thing... I want this thing to roam around and map my apartment. It's a precursor tech/proof-of-concept deal for outdoor versions, particularly ocean-exploring ones, or at least a lake.

The concept

This is a gimmick, but my previous robot sucked... it was weak, but the main thing is it was dumb. It had no way to build a state of the world; it just had a bump/proximity sensor (ultrasonic). So this time I wanted an IMU, which I have never really used in a real setting beyond just pulling data from it. The IMU is used to track the robot's state as it moves around and collects data with the single-point lidar ToF sensor.

The robot will form a "conical squiggle" sampling pattern with the single beam in order to sweep out a bounding box of some shape in front of it. That then becomes a box to be avoided. Right now it is actually a wide aspect-ratio rectangle.

I drew this 4 months ago. The circular scanning pattern is a little harder to code but possible.
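To make the scanning idea concrete, here is a minimal sketch of the two patterns, assuming a hypothetical aimAndSample(yaw, pitch) callback that tilts the body and fires a ToF reading. The spans, grid size, and turn count are placeholder values I made up, not numbers from the repo:

```
// Hypothetical sketch of the scan-pattern math, not the repo's actual code.
// Generates (yaw, pitch) aim angles for the sensing cone in front of the robot.
#include <math.h>

const float YAW_SPAN = 30.0f;    // degrees left/right of center (assumed)
const float PITCH_SPAN = 20.0f;  // degrees above/below center (assumed)
const int ROWS = 5, COLS = 7;    // S-pattern grid resolution (assumed)

// S-pattern (what the robot does now): raster rows, alternating direction.
// aimAndSample() is a placeholder: tilt the body there, take one ToF reading.
void sPattern(void (*aimAndSample)(float yaw, float pitch)) {
  for (int r = 0; r < ROWS; r++) {
    float pitch = -PITCH_SPAN + 2.0f * PITCH_SPAN * r / (ROWS - 1);
    for (int c = 0; c < COLS; c++) {
      int cc = (r % 2 == 0) ? c : (COLS - 1 - c);  // reverse every other row
      float yaw = -YAW_SPAN + 2.0f * YAW_SPAN * cc / (COLS - 1);
      aimAndSample(yaw, pitch);
    }
  }
}

// Conical squiggle (the harder-to-code circular version): spiral outward
void conicalSpiral(void (*aimAndSample)(float yaw, float pitch), int steps) {
  for (int i = 0; i < steps; i++) {
    float t = (float)i / (steps - 1);        // 0..1 along the spiral
    float angle = t * 4.0f * 2.0f * M_PI;    // 4 turns out to the edge (assumed)
    aimAndSample(t * YAW_SPAN * cosf(angle), t * PITCH_SPAN * sinf(angle));
  }
}
```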

I am not a math person, so figuring out how to use an IMU is not easy for me. I still have not gotten it fully down as of writing this. I know that servos by default don't have positional feedback; the angle you command is an estimate. The IMU is the main thing here: you can hopefully track the angle turned, the current orientation, etc. pretty well, along with that known sensing pyramid area.
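For reference, the core of the math I'm chasing is small: given the IMU's orientation estimate and one range reading, you can project a 3D point. This is a minimal sketch of that projection, assuming the ToF beam is aligned with the IMU's axes (tofToPoint is a name I made up):

```
// Hypothetical math for turning one ToF reading into a 3D point, using the
// IMU's orientation estimate instead of (nonexistent) servo feedback.
#include <math.h>

struct Point3 { float x, y, z; };

// yaw/pitch in radians from the IMU, range in meters from the ToF sensor;
// assumes the two sensors are aligned on the same axes (which is part of
// why I mounted them on the same plane).
Point3 tofToPoint(float yawRad, float pitchRad, float rangeM) {
  Point3 p;
  p.x = rangeM * cosf(pitchRad) * cosf(yawRad);  // forward
  p.y = rangeM * cosf(pitchRad) * sinf(yawRad);  // left/right
  p.z = rangeM * sinf(pitchRad);                 // up/down
  return p;
}
```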

Anyway, I finally had the time to build it. I went with a Teensy because I heard they were more powerful and I wanted to use one. Thankfully the Teensyduino environment takes care of using libraries. I had problems at first because I was trying to use libraries by cloning them into a folder the Teensyduino editor did not know about, so I was failing at that basic step, but I got help, from the author of the IMU (Bolderflight MPU-9250) library no less. Pretty cool. Shoulders of giants, as they say.
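For anyone stuck at the same basic step: once the library lives where Teensyduino can see it, bring-up looks roughly like this. This follows the Bolder Flight MPU9250 library's API as I understand it; the version in the repo may differ, so check its examples:

```
// Minimal bring-up sketch in the style of the Bolder Flight MPU9250 library
// (check the library's own examples; this is a sketch, not the repo's code).
#include "MPU9250.h"

MPU9250 imu(Wire, 0x68);  // I2C address 0x68 with the AD0 pin low

void setup() {
  Serial.begin(115200);
  while (!Serial) {}
  if (imu.begin() < 0) {   // negative return usually means a wiring/address issue
    Serial.println("IMU init failed");
    while (1) {}
  }
}

void loop() {
  imu.readSensor();        // pull accel/gyro/mag in one transaction
  Serial.print(imu.getAccelX_mss()); Serial.print('\t');
  Serial.println(imu.getGyroZ_rads());
  delay(20);
}
```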

In retrospect, do you need a Teensy for this project? I don't think so... I mean, I'm not sampling anything 600 million times a second (600 MHz being the Teensy 4.0's clock, and it can be overclocked even faster). I was actually concerned when I felt it and it was warm, ha, but that's normal apparently. I did take advantage of the Teensy 4.0's many IO pins for this project, namely my laziness with the I2C bus mapping.

After I got the basic library-compiling issue sorted, the rest was pretty straightforward to do physically. I had to figure out how to wire everything. With I2C you can have one bus with multiple devices connected to the same wires, but since I didn't want to deal with that right now, I just used two of the buses on the Teensy 4.0, one per sensor. Then I use one of the Tx/Rx pairs (#5) on this board to talk to the ESP-01.
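In code, that wiring maps out something like the sketch below. The pins listed are the Teensy 4.0 defaults for those peripherals; which sensor sits on which bus is just my assumption here:

```
// How the two-bus wiring maps to code on the Teensy 4.0 (a sketch of the
// idea; the pin numbers are the Teensy 4.0 defaults for these peripherals).
#include <Wire.h>

void setup() {
  Wire.begin();    // bus 0, pins 18 (SDA0) / 19 (SCL0) -- e.g. the IMU
  Wire1.begin();   // bus 1, pins 17 (SDA1) / 16 (SCL1) -- e.g. the ToF sensor
  Serial5.begin(115200);  // Tx/Rx pair #5, pins 20 (TX5) / 21 (RX5) -> ESP-01
}

void loop() {
  // Each sensor library just gets handed its own bus object (Wire vs. Wire1),
  // so there is no address juggling on a shared bus.
  Serial5.println("{\"hello\":\"esp01\"}");  // telemetry out to the ESP-01
  delay(500);
}
```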

Then I just had to spend the time designing the body and printing it. It takes a long time to print: for reference, the main chassis (the battery halves glue onto it) takes 3 hrs to print, each leg quadrant takes 3.5 hrs, etc. Then I do a sweep test on each servo to make sure it's not stripped/broken, center it, and assemble the leg.
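The sweep test is just the classic Arduino servo sketch, something like this (pin 3 is a placeholder; if the servo stutters or skips during the sweep, the gears are stripped):

```
// The kind of sweep test I run on each servo before assembling a leg.
#include <Servo.h>

Servo testServo;

void setup() {
  testServo.attach(3);             // any PWM-capable pin (placeholder)
  for (int a = 0; a <= 180; a++) { // slow sweep across the full range
    testServo.write(a);
    delay(15);
  }
  for (int a = 180; a >= 0; a--) { // and back
    testServo.write(a);
    delay(15);
  }
  testServo.write(90);             // center it before assembling the leg
}

void loop() {}
```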

I am aware it is a terrible design to move many/all servos to do scanning when a dedicated pan/tilt platform could do it, but this robot is also supposed to be "simple". The pan/tilt approach, aside from adding weight, would still make me want an IMU on it unless the servos had positional feedback. Even then, I think I would prefer an IMU on the pan/tilt bed, which means some kind of slip ring, hopefully small enough. I am still considering doing this at some point.

Design

This is SketchUp 2017 free version

For the design, I modeled all the actual parts, e.g. the servos/sensors. They were very rough dimensions, mostly just the major dimensions for fit. The plan was to base the body around the battery, so that's why it looks the way it does. Then I just tried to align the sensors on the same planes, except the ToF is farther forward than the IMU (middle of frame).

It took me a bit to design the legs because I had to imagine how they would work. I mentioned I'm stubborn/ego-driven, but I do try to figure things out on my own before getting help. That's not always the case though, as my flagged questions on Stack Overflow show.

Actual build

mmm wire management

Yeah, it does not look the same as my initial sketch, lol... particularly the way the battery is positioned. The battery is underneath; I had to cut the battery in half (a trick I read in a Reddit post), because this battery has a protection circuit on it, so it's a bit longer.

I was waiting for the last leg parts to print at this time.

This design is not ideal. I initially meant to do the math and evaluate it with a free-body diagram (FBD)... instead I just used some rough estimates, namely the servo's max torque against the moment arm (horizontal leg length). So the legs try to be less than 3" at worst, ideally 1", since the robot is just over 10 oz and a fourth of that is 2.5 oz... the servo has a curved torque graph, which I got from this video.
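The rough estimate boils down to one line of arithmetic: torque needed = weight on the leg × moment arm. A throwaway calculation, assuming a micro-servo rating of ~1.6 kg-cm (swap in your servo's real number, and remember the actual usable torque is lower per that curved graph):

```
// Back-of-envelope torque check (the "rough estimates" mentioned above).
// The servo rating is an assumption for a generic micro servo.
#include <stdio.h>

int main(void) {
  const float per_leg_g = 2.5f * 28.35f;  // a fourth of ~10 oz = 2.5 oz ~ 71 g
  const float rated_gcm = 1600.0f;        // assumed ~1.6 kg-cm @ 4.8 V rating

  for (float arm_in = 1.0f; arm_in <= 3.0f; arm_in += 1.0f) {
    float arm_cm = arm_in * 2.54f;
    float need_gcm = per_leg_g * arm_cm;  // static torque to hold one corner
    printf("%.0f in arm: need %.0f g-cm of %.0f rated (%.0f%%)\n",
           arm_in, need_gcm, rated_gcm, 100.0f * need_gcm / rated_gcm);
  }
  return 0;
}
// 1" -> ~180 g-cm (~11%), 3" -> ~540 g-cm (~34%). Hence "less than 3 inches at
// worst, ideally 1", once you derate for the real torque curve and dynamics.
```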

Another problem is the legs can't move through 100% of their range, which is not entirely a bad thing either... if the robot were to stand up from a flat stance, the current consumption would spike beyond the budget, or very near it, e.g. 3 amps. In that case you'd want to move a single servo at a time, or fewer at once in general. This is assuming the legs are just slumped, with the servos sitting flat on the ground.
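One mitigation sketch: move the servos round-robin, one degree at a time, so only one servo is accelerating at any instant instead of all eight slamming to their targets together. Pin numbers and target angles below are placeholders:

```
// Stand up gently to avoid the ~3 A spike (sketch only, placeholder values).
#include <Servo.h>

const int NUM_SERVOS = 8;
Servo legs[NUM_SERVOS];
int current[NUM_SERVOS];  // last commanded angle per servo

void standUpGently(const int target[NUM_SERVOS]) {
  bool moving = true;
  while (moving) {
    moving = false;
    for (int i = 0; i < NUM_SERVOS; i++) {  // round-robin, 1 degree each pass
      if (current[i] != target[i]) {
        current[i] += (target[i] > current[i]) ? 1 : -1;
        legs[i].write(current[i]);
        delay(5);  // only one servo is accelerating at any instant
        moving = true;
      }
    }
  }
}

void setup() {
  const int pins[NUM_SERVOS] = {2, 3, 4, 5, 6, 7, 8, 9};  // placeholder pins
  for (int i = 0; i < NUM_SERVOS; i++) {
    legs[i].attach(pins[i]);
    current[i] = 90;
    legs[i].write(90);
  }
  const int stand[NUM_SERVOS] = {60, 120, 60, 120, 60, 120, 60, 120};
  standUpGently(stand);
}

void loop() {}
```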

The main board is kind of janky; it's just floating in the air. I will design a supporting piece (already done) which happens to also house the power supply: a custom job around an mEZD41503A-A single-cell-to-5V @ 3A module that costs 10x as much as a cheap DC-DC step-up converter, but it's the right one for the job.

Oh yeah this is the schematic, lol sorry for the crayon circuit.

The sensors connect by color/wire

Update from above

I ended up mounting the beast step-up converter I got externally like this:

The crappy board also has a battery voltage tap, through a voltage divider to respect the Teensy's 3.3V max pin limit

I am not at the stage yet where I can design a board and send it to OSH Park, so I just use protoboards for now. I have to learn Fusion 360 first, before something like KiCad.

You can see how I tried to align the IMU with the ToF sensor to reduce math/make assumptions easier. I have no idea right now if there is something immediately dumb about this IMU placement regarding interference. Also, these IMUs are generally garbage: you buy them and 50% don't work out of the box (e.g. gravity reads a fixed 7 instead of ~9.8 m/s²), and the rest you have to calibrate/adjust.

I made everything socketed so I could just pull or replace parts if needed. The servos, depending on which one, are not as easy to replace, particularly because the "servo boot" is glued to the bottom of the servos.

I will add a voltage tap on the battery to an analog pin so I can tell how it’s doing.
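The read itself is only a few lines, like the sketch below. The 10k/10k divider values are an assumption; plug in whatever resistors you actually used:

```
// Battery voltage tap sketch: the divider halves the 1S cell voltage
// (max ~4.2 V) so it stays under the Teensy's 3.3 V pin limit.
const int BATT_PIN = A0;          // placeholder analog pin
const float R_TOP = 10000.0f;     // divider top resistor (assumed 10k)
const float R_BOTTOM = 10000.0f;  // divider bottom resistor (assumed 10k)

float readBatteryVolts() {
  float pinV = analogRead(BATT_PIN) * 3.3f / 1023.0f;  // 10-bit ADC, 3.3 V ref
  return pinV * (R_TOP + R_BOTTOM) / R_BOTTOM;         // undo the divider
}

void setup() { Serial.begin(115200); }

void loop() {
  Serial.println(readBatteryVolts());  // ~4.2 V full, ~3.3 V nearly empty
  delay(1000);
}
```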

Walking

It was still wired here, as the initial 2A-max DC-DC converter I had couldn't power it. The walking gait needs work; I tried to imagine it, but what I came up with is bad.

Yeah... I'm stubborn. I wanted to figure it out on my own, even though I could just look up the ideal gait for a quadruped and use it (I'm going to). Anyway, mine sucks, but it does move. I will improve this over time.

Side note: I prefer insect type robots more than the dog ones. They just seem cooler, “more robotic”.

The way it moves right now, if it wants to lift a leg, it has to tilt away from that leg, so the opposite corner goes up and the body tilts that way. I'm not sure right now if that is how quadrupeds actually walk. I have an IMU, so theoretically I should be able to use gravity to assist in walking, where you fall toward where you're stepping.
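Here is the step sequence I described, as a rough sketch. setHip()/setKnee() are hypothetical helpers standing in for however the leg servos are actually driven; the angles are placeholders:

```
// Sketch of the "tilt away, then lift" step, not the repo's actual gait code.
#include <Servo.h>

enum Leg { FL, FR, BL, BR };

void setHip(Leg leg, int deg)  { /* map (leg, deg) to the right servo.write() */ }
void setKnee(Leg leg, int deg) { /* map (leg, deg) to the right servo.write() */ }

Leg opposite(Leg leg) {          // diagonal corner of a given leg
  switch (leg) {
    case FL: return BR;
    case FR: return BL;
    case BL: return FR;
    default: return FL;
  }
}

void stepLeg(Leg leg, int forwardDeg) {
  setKnee(opposite(leg), 60);    // push the diagonal corner up, so the body
  delay(200);                    //   tilts away from the leg we want to lift
  setKnee(leg, 120);             // that leg is now unloaded; lift it
  setHip(leg, forwardDeg);       // swing it forward
  setKnee(leg, 90);              // plant it
  setKnee(opposite(leg), 90);    // level the body again
  delay(200);
}

void setup() { stepLeg(FL, 70); }  // placeholder demo step
void loop() {}
```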

Actual obstacle detection “twerking”

Motions slowed down for clarity; note the S-pattern

I described above how this thing works, but yeah: the robot starts somewhere in the world and does an initial scan straight ahead. This sensor is short range, roughly 6 feet and under, unlike the more expensive TFmini-S I bought (4x the cost of this one), which seems to be ideal for about 2 feet and beyond. The robot can keep track of its acceleration/velocity and orientation... combine all that and you should be able to track it pretty well... I think I have to incorporate the magnetometer, but I'm not sure yet since it's inside an apartment.
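On the orientation-tracking piece, the standard cheap trick is a complementary filter: the gyro is smooth but drifts, while the accelerometer is noisy but always knows which way gravity points. A minimal pitch-only sketch follows (the 0.98/0.02 blend is a typical starting guess, not a tuned value). Note that yaw has no gravity reference, which is exactly why the magnetometer question comes up, and indoors it can be unreliable:

```
// Minimal complementary filter for pitch (a sketch, not the repo's filter).
#include <math.h>

float pitchDeg = 0.0f;

// gyroY in deg/s, accel in m/s^2, dt in seconds since the last update
void updatePitch(float gyroY_dps, float ax, float az, float dt) {
  float gyroPitch = pitchDeg + gyroY_dps * dt;         // integrate the gyro
  float accelPitch = atan2f(ax, az) * 180.0f / M_PI;   // gravity direction
  pitchDeg = 0.98f * gyroPitch + 0.02f * accelPitch;   // blend (98/2 assumed)
}
```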

The bare-minimum obstacle detection is "is something big in front of me?", then get out of the way. That is pretty weak, and not much better than the ultrasonic robot.

The ideal case is that you keep the detected object in memory as you go along, and track the robot's progress/where those things are in relation to the robot.
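A sketch of what "keep it in memory" could look like: rotate each robot-relative hit by the tracked heading, translate by the tracked position, and store the world-frame point. The fixed-size array and the flat 2D simplification are my assumptions:

```
// Remember detected points in world coordinates using the tracked pose.
#include <math.h>

struct Pose { float x, y, headingRad; };  // tracked robot state (world frame)
struct Obstacle { float x, y; };          // world-frame point, ignoring height

const int MAX_OBS = 64;
Obstacle obstacles[MAX_OBS];
int obsCount = 0;

// localX = forward distance, localY = left offset, relative to the robot
void rememberHit(const Pose& pose, float localX, float localY) {
  if (obsCount >= MAX_OBS) return;  // crude fixed-size memory
  float c = cosf(pose.headingRad), s = sinf(pose.headingRad);
  obstacles[obsCount].x = pose.x + c * localX - s * localY;  // rotate+translate
  obstacles[obsCount].y = pose.y + s * localX + c * localY;
  obsCount++;
}
```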

These are what part 2 will be about, as well as the telemetry described below.

Mapping and real time telemetry

This has an ESP-01 that you can connect to via a websocket; this is how you get real-time telemetry on the ToF/IMU/servo positions and battery voltage.
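The general shape of the ESP-01 side, assuming the common arduinoWebSockets library (the repo's actual implementation may differ): it just relays telemetry lines arriving over the serial link from the Teensy to any connected websocket clients.

```
// ESP-01 telemetry relay sketch (assumes the arduinoWebSockets library).
#include <ESP8266WiFi.h>
#include <WebSocketsServer.h>

WebSocketsServer webSocket(81);  // clients connect to ws://<esp-ip>:81

void setup() {
  Serial.begin(115200);                  // wired to the Teensy's Serial5 pair
  WiFi.begin("your-ssid", "your-pass");  // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  webSocket.begin();
}

void loop() {
  webSocket.loop();
  if (Serial.available()) {              // one line of telemetry from the Teensy
    String line = Serial.readStringUntil('\n');
    webSocket.broadcastTXT(line);        // push it to every connected browser
  }
}
```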

Furthermore, I can just export the SketchUp model as a glTF and import it into ThreeJS, then use the real-time telemetry to show what the robot is doing. I'm not a 3D guy though, so the ThreeJS part is slow going at the moment.

Disclaimer

Everything for this project is there. It requires know-how, but this is not really meant to be reproduced, because it is time-consuming, not beginner-friendly, and requires work. The main issue is the circuit board itself; that requires some soldering.

Closing thoughts for now

In the end this is a learning project. I mean, this thing is a piece of crap/a toy, but I learned a lot from it, knowledge I will take with me to the next phase. At the end of a project you ask yourself "why did I bother?", but it is about the experience. I had fun making this thing.

There are still a lot of software things I will do. I want to make a "true" Windows GUI application using some library, e.g. Qt, vs. using a browser wrapper. I mean, yeah, it's way faster to just load a React app in ElectronJS, but still, I have not made a true native desktop app before. Another project I have to do is build/deploy a Pinephone app; I have to do that before I can get the Pro, since I have the first one.

So yeah... just learning, and it keeps me occupied.

This is a cheap build and not really supposed to do it, but a future version of this will use computer vision, e.g. a camera. I already have a Pi Zero 2 for it. Although CV is so hard that I would probably just use a dumb box, like in the crappy SLAM project that I have not finished yet.

Last tidbit: I did think about how this thing could have an induction charger on the bottom and find home/charge itself.

The gif above is a real video lol. I picked a random free YT song. Another item to be filed under “cringe moments”.
