Twerk Lidar Robot — update (part 2)

Jacob David C. Cunningham
3 min read · Jun 21, 2022

Part 1 | Repo

It has this weird issue of tipping backwards to the right, so it used to carry a weight in the front

I will start a new job next week, and it's just dawning on me how much time I'm going to lose; I will be lucky to have a couple of hours to myself on weekdays. Anyway, I have still been working on this project here and there, and I have made some progress. The main updates are mapping and the 3D model import into a browser via ThreeJS/glTF.

I have been tracking my progress on Hackaday.

Navigation

I initially planned on using the IMU to determine the pitch angle of the ToF sensor, but that proved problematic, so for now it uses fixed values measured externally with a camera and SketchUp. The yaw sweep, on the other hand, was close: off by a couple of degrees over a 20–30 degree sweep, but I was getting something, so that part still uses values from the IMU. When plotted, the result looked sensible.
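To make that concrete, here is roughly how one probe reading becomes a point. This is a sketch under my own naming and frame conventions (the `ProbeReading` fields and the downward-pitch sign are assumptions, not the repo's actual code):

```typescript
// A minimal sketch of turning a single probe reading into a 3D point.
// Field names and conventions are illustrative, not from the repo.
interface ProbeReading {
  yawDeg: number;      // sweep angle, taken from the IMU
  pitchDeg: number;    // fixed tilt value measured externally (camera/SketchUp)
  distanceMm: number;  // ToF range reading
}

function probeToPoint(r: ProbeReading): { x: number; y: number; z: number } {
  const yaw = (r.yawDeg * Math.PI) / 180;
  const pitch = (r.pitchDeg * Math.PI) / 180;
  // Spherical to Cartesian: project the range onto the ground plane,
  // then split the horizontal component by yaw.
  const horizontal = r.distanceMm * Math.cos(pitch);
  return {
    x: horizontal * Math.sin(yaw),
    y: horizontal * Math.cos(yaw),
    z: -r.distanceMm * Math.sin(pitch), // pitching down puts points below the sensor
  };
}
```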

You can see an example depth probe/mapping below.

Each square on the grid (right) is 10" sq

It is rough because the ToF is mounted on the edge of the body, so the sweep traces an arc that still needs to be removed/corrected. But these values are pulled straight from the sensor, as seen below.
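The arc correction itself is basically a lever-arm offset: shift each point outward by the sensor's distance from the sweep axis, along the direction it was facing. A hypothetical sketch (the mount radius is a made-up placeholder):

```typescript
// Hypothetical correction for the sensor sweeping on an arc instead of
// rotating in place. SENSOR_MOUNT_RADIUS_MM is a placeholder value.
const SENSOR_MOUNT_RADIUS_MM = 40;

function correctForArc(
  point: { x: number; y: number; z: number },
  yawDeg: number
): { x: number; y: number; z: number } {
  const yaw = (yawDeg * Math.PI) / 180;
  // Shift the measured point outward by the sensor's own offset from
  // the sweep axis, along the same yaw direction it was facing.
  return {
    x: point.x + SENSOR_MOUNT_RADIUS_MM * Math.sin(yaw),
    y: point.y + SENSOR_MOUNT_RADIUS_MM * Math.cos(yaw),
    z: point.z,
  };
}
```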

In the end I am not using this (yet), since it requires more math (ray-trace collision). Right now I'm doing something basic: if a reading comes back under some threshold, it counts as a hit.
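The threshold check really is that simple, something like this (the cutoff value here is an arbitrary illustration, not the project's tuned number):

```typescript
// The "basic thing": any reading closer than a threshold counts as an obstacle.
const HIT_THRESHOLD_MM = 300; // arbitrary illustrative cutoff

function isObstacle(distanceMm: number): boolean {
  return distanceMm > 0 && distanceMm < HIT_THRESHOLD_MM; // 0 often means "no return"
}
```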

Then you tie that into the world map, which is basic; the obstacles hit are rendered as cubes like below (this still needs work).
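Marking a hit on the world map then amounts to dropping a cube into the ThreeJS scene at the right grid cell. A rough sketch, with illustrative names and the 10" cells from the plot:

```typescript
import * as THREE from "three";

// Hedged sketch: place a cube wherever a probe hit lands on the grid.
const CELL_SIZE = 10; // inches, to match the plot's grid squares

function addObstacleCube(scene: THREE.Scene, gridX: number, gridY: number): void {
  const geometry = new THREE.BoxGeometry(CELL_SIZE, CELL_SIZE, CELL_SIZE);
  const material = new THREE.MeshStandardMaterial({ color: 0xff5533 });
  const cube = new THREE.Mesh(geometry, material);
  // Center the cube on its grid cell, sitting on the ground plane.
  cube.position.set(gridX * CELL_SIZE, CELL_SIZE / 2, gridY * CELL_SIZE);
  scene.add(cube);
}
```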

The walking is messed up because I replaced all of the servos. After a long while of testing/screwing around with the robot, a few finally died. The gaits are manually programmed, so it is not easy to get back to the original "good state." Another improvement would be to actually use inverse kinematics. I'm currently using a quad spider gait based on Regis Hsu's project; I watched a few of his videos of how his robot moved and then programmed the gaits by hand.
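For context, a hand-tuned gait is essentially a keyframe table: sets of servo angles held for fixed times. The real gait lives in the robot's firmware, but sketched in TypeScript with invented joint ordering, angles, and timing, the idea looks like this:

```typescript
// Sketch of a hand-tuned gait: each keyframe is a set of servo angles held
// for a fixed time. Joint order, angles, and timing are all invented here.
type Keyframe = { angles: number[]; holdMs: number }; // 8 servos: 4 hips + 4 knees

const creepStep: Keyframe[] = [
  { angles: [90, 45, 90, 45, 90, 45, 90, 45], holdMs: 150 }, // neutral stance
  { angles: [90, 70, 90, 45, 90, 45, 90, 45], holdMs: 150 }, // lift front-left knee
  { angles: [60, 70, 90, 45, 90, 45, 90, 45], holdMs: 150 }, // swing front-left hip forward
  { angles: [60, 45, 90, 45, 90, 45, 90, 45], holdMs: 150 }, // plant the foot
];
```

This is also why replacing servos breaks everything: every hard-coded angle assumes the old servos' centering, which is exactly the problem inverse kinematics would solve.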

3D import

This actually did not take that long to prove out, although it is not complete. What I still have to do is import the moving parts individually and then bind to/animate them. That takes work.

But you can see above that I got it done: I imported the whole robot, which I had to export as a .glb. Now I just have to clean up the UI. It has some functionality already, like manually disconnecting and receiving telemetry from the robot (which plots the squares above). I was not able to stream the mesh data (it's a lot, over 1,500 points), so all of that would have to happen on the robot.
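The import itself is the standard GLTFLoader flow, and binding to a moving part means finding its node by name in the loaded scene. A minimal sketch, where "robot.glb" and the node name are placeholders of mine:

```typescript
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// "robot.glb" and "front_left_hip" are placeholders; real names come from the export.
loader.load("robot.glb", (gltf) => {
  scene.add(gltf.scene);
  // "Binding to a moving part" = finding its node by name, then driving it:
  const hip = gltf.scene.getObjectByName("front_left_hip");
  if (hip) hip.rotation.z = Math.PI / 6; // rotate the joint 30 degrees
});
```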

Overall I need to hone in on the individual parts more. One thing I need to do at some point is work with an IMU on its own: attach it to something like a Pi or an ESP, broadcast the data, and crunch the numbers in real time, separate from the robot, to work out the math. I also need to calibrate the mag/accel values to make it more accurate, and really understand the dead reckoning stuff… I think I got it to work, but in some ways it does not work correctly (rotating the robot and watching the axes, they don't match; the only thing that works is down, regarding NED).
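The "only down works" part matches what the math gives you for free: gravity alone pins down pitch and roll, while heading needs the (calibrated) magnetometer, which is exactly the part that wasn't matching. A sketch of the accelerometer-only piece, using the standard tilt formulas:

```typescript
// Tilt from the accelerometer alone: gravity defines "down" (the D in NED),
// so pitch and roll come out directly. Yaw/heading is not observable from
// the accelerometer and needs a calibrated magnetometer.
function tiltFromAccel(ax: number, ay: number, az: number) {
  const rollRad = Math.atan2(ay, az);
  const pitchRad = Math.atan2(-ax, Math.hypot(ay, az));
  return {
    rollDeg: (rollRad * 180) / Math.PI,
    pitchDeg: (pitchRad * 180) / Math.PI,
  };
}
```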

Videos

These are some longer explanation videos. There are more on the channel, released over the months since January (the first post).

Closing thoughts

This project is not done. I'll work on it here and there, but yeah, it's a time-consuming project and needs a code rewrite.
