March 2026: The Robot Edition

Hello and welcome to another Burf Update. So, as usual, my ability to focus on a single thing has gone Pete Tong. In the last edition, I was set on getting the Platform out there and seeing if it has legs. Instead, I purchased some new toys!

iPAL Humanoid Companion Robot
The first interesting addition to my robot collection is the iPAL robot — slightly taller than a Sanbot, it’s a super cute, child-like robot. It has working arms, though the fingers aren’t powered. It runs Android 7 and they’ve done a solid job with the software — the head tracking is a nice touch and the voice is surprisingly cute. I believe they still sell them, and it even comes with a ROS driver, which I’m looking forward to exploring.

Unmanned Ground Vehicle (UGV)
This large, monster truck-like robot was an interesting eBay find — clearly someone put serious money into building it. It runs the Herelink system with a CubePilot Cube Black brain, and comes equipped with GPS (Here2), Lidar, and a camera system. From the Android-powered controller you can send it GPS waypoints and off it goes autonomously. It’s a new area of robotics for me and I’m looking forward to spending more time with it.

PiBob Desktop Robot
I also built a desktop robot called PiBob. The aim was to design something fun that was as cheap and simple to build as possible — and it was great fun to make. It’s powered by a Raspberry Pi 3 (had one lying around) and a servo shield, using affordable servos that keep the total build cost under £100 excluding the Pi. It’s still early stages, but I’m hoping to expand it into a larger, moving version.


The Platform
Not much progress on the Platform this month — I’ve been trying to find the headspace to properly define the MVP and the problem it actually solves. The current robot support list, outside of the ROS 2 driver, is fairly limited and a little dated. Once I’ve got clarity on the direction, I can move it forward with more purpose.


Tarot FY680 Heavy Lift Autonomous Drone
I bought this drone a few months back, but have never tried to fly it — out of pure fear. If this thing hits you or falls out of the sky, you’re going to get seriously hurt. It’s 1 meter tip to tip and can lift up to 7 kg. I’ve just completed my Flyer ID and Operator ID, so I can now legally fly it, and my first flight was a brave 1 meter off the ground.

Burf Robotics
Not much progress this month — I’ve had a couple of chats with other companies, but I still haven’t actively gone out and marketed it. People have been coming to me, which is encouraging, but I need to change that dynamic.

ROS2
I’m continuously using ROS 2 but need dedicated time to expand my knowledge. This month I’ve been more focused on OLO consultancy work, which has come at the expense of learning. I need to prioritise this.

Brilliant
Continuing daily and at 479 days now. Like ROS 2, I need to push myself out of my comfort zone with it — or decide to drop it.

Hack24
Parked until I have time to properly test it.

Meta Quest Tracking
I spent a little time on this to see if it works with PiBob. The results were underwhelming, so it needs more attention before it’s useful.

InMoov Humanoid Robot
Nothing to report this month — and honestly, writing this blog has made it clear just how much I’ve lost focus. That needs to change.

Plan for 2026
The plan isn’t exactly going to plan — a lot of the goals above haven’t moved forward, but I did redirect my time towards some shiny new things instead.

  • Strengthen Burf Robotics and its offerings
    Target: Gain at least one additional customer. (Progressing)
  • Launch Burf Platform as an MVP
    Target: Deliver an MVP with user management, robot management, and support for a range of robot platforms. (Done)
  • Continue learning ROS2
    Target: Deepen my understanding of Nav2, MoveIt, and integrating AI with ROS2. (Needs more progress)
  • Inmoov Robots
    Target: Focus more on interactive demonstrations rather than building additional robots. I’d still like to develop a mobile InMoov platform. (Needs more progress)
  • Continue Brilliant learning
    (Needs more progress)
  • Build my own robot product
    Target: Not fully committed yet, but PiBob is a proof of concept and a step in that direction. (PiBob POC built)

Let’s be honest — I failed at formalising a plan this month. I’m going to sit with this blog post for a while and figure out what actually matters to me.

February 2026: The Platform Edition

Hello and welcome to another Burf Update. I have been super busy trying to get the first version of The Platform out to see if the idea has legs. Hopefully by next month I should know if it does.

The Platform
Formerly known as Burf Platform—a name that didn’t make much sense—the Platform is a robotics system designed to make it easy for users to teleoperate their robots. It is launching with ROS 2 support and also supports a range of other robots, including the Sanbot Elf, Double 2, and even the InMoov humanoid robot.

The goal is to contribute something meaningful to the field of robotics. There are plans to expand support to additional robot types, with a particular focus on helping hobbyists, supporting drones, and enabling data collection. A fully functional API is already available.

Burf Robotics
The goal is to continue building the brand and client base. I have already had some positive talks with other robotics companies this month. I still have the urge to build my own robot; however, The Platform is my current focus.

ROS2
I am continuously using ROS 2; however, I need some focused time to expand my knowledge. I have been concentrating on The Platform, which needed only limited ROS 2 functionality. I need to get back to the Udemy courses I started a while back.

I have started work on a PiCar ROS 2 driver (Ackermann-steering), and I have updated my TurtleBot2 to ROS 2, and it is now running a Jetson Nano instead of the Jetson TK1. I also purchased a VIAM robotic rover to try out.

Brilliant
Continuing daily and at 443 days now. Like ROS 2, I need to take myself out of my comfort zone.

Hack24
A blast from the past, Hack24 was one of my favorite game projects that I built over 10 years ago. Well, I decided to chuck the old Objective-C OpenGL code base at Claude and see if it could rewrite it in Swift using SceneKit, and I have to say, it did a good job. The server has been rewritten in Swift Vapor, as I need to focus on learning Swift again. I hope to release it pretty soon as a bit of fun.

Meta Quest Tracking
I have also started working on a Meta Quest 3 app to track body movement. This is to replace the Apple Vision Pro demo I had controlling the Inmoov Humanoid Robot, as I do not have access to the AVP anymore.

InMoov Humanoid Robot
I did have a brief play with one of my InMoovs recently. The SD card in one of the Raspberry Pis had died, so I fixed that and ordered it a sound card and mic. I hope to use this with the tracking app very soon.

Plan for 2026
The last month has been super busy, so yet again I find myself running before I have properly thought about the year’s goals. However, let’s see if there is any progress from last month.

  • Strengthen Burf Robotics and its offerings
    Target: Gain at least one additional customer. (Progressing)
  • Launch Burf Platform as an MVP
    Target: Deliver an MVP with user management, robot management, and support for a range of robot platforms. (Done)
  • Continue learning ROS2
    Target: Deepen my understanding of Nav2, MoveIt, and integrating AI with ROS2. (Needs more progress)
  • Inmoov Robots
    Target: Focus more on interactive demonstrations rather than building additional robots. I would still like to develop a mobile InMoov platform. (Progressing)
  • Continue Brilliant learning
  • Build my own robot product
    I am not fully committed to this yet; however, I did begin building my own robotic arm. Developing my own robot could be a valuable learning experience, particularly by designing and implementing the ROS2 drivers around custom hardware. This would focus on a robotic system with arms rather than a mobile platform. (Needs refining)

Hopefully, in next month’s edition, I will actually finish formalising a plan.

January 2026: Happy New Year Edition

Hello and welcome to another Burf Update. I hope everyone had an amazing Christmas and an epic New Year. The Christmas period rushed through at lightning speed, but was enjoyable. In this edition, I hope to map out what I plan to do in 2026.

Burf Robotics
Burf Robotics is the key focus for 2026: I want to make sure it is properly operating as a company now that it has its first customer (OLO Robotics), which moves it from an idea to a reality. I am also working on its first product, the Burf Platform.

Burf Platform
Over the festive period, I spent much of my spare time working on the platform. The current vision remains a remote teleoperation platform that supports multiple types of robots. So far, I have developed drivers for:

  • Sanbot Elf
  • Afobot
  • ROS2 Mobile robots
  • Double 2 Robot

The ROS2 driver is the one I am most excited about, as it has the potential to open access to a wide range of robots. My aim is to make integrating the driver as simple as possible, mapping the camera feed (via WebRTC), sensors, and teleoperation topics directly into the platform. The next step is to set up hosting for the platform.
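To make the driver idea concrete, here is a minimal sketch of the mapping described above. It is purely illustrative: the topic names, the role-to-topic map, and the message shapes are my own assumptions, not the Platform's actual API.

```python
# Hypothetical sketch of the Platform's ROS 2 driver mapping.
# Topic names and message shapes are illustrative assumptions only.

# Which ROS 2 topics the driver bridges into the platform, by role
TOPIC_MAP = {
    "camera": "/camera/image_raw",  # relayed to the browser via WebRTC
    "battery": "/battery_state",    # periodic sensor telemetry
    "teleop": "/cmd_vel",           # velocity commands from the UI
}

def teleop_to_twist(linear: float, angular: float) -> dict:
    """Translate a joystick-style teleop command from the platform UI
    into a geometry_msgs/Twist-shaped dict for publishing on /cmd_vel."""
    return {
        "linear": {"x": linear, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": angular},
    }

# e.g. a forward-and-turn-right command from the UI:
msg = teleop_to_twist(0.5, -0.2)
```

In a real driver this dict would be a `geometry_msgs/msg/Twist` published by an rclpy node; the point of the sketch is just that integration reduces to declaring which topics play which role.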

ANYCUBIC Photon Mono 4
So on a different note, I got a resin printer for Christmas, which was awesome. I am super impressed with the printing quality; however, it is a lot messier and smellier than normal 3D printing. I hope it will expand what I am able to print for my projects.

ROS2
I have continued my learning on ROS2, and I am really enjoying it. I still have an awful lot to learn; it feels like the more I understand, the more there is to learn. However, for my needs, I am making good progress. I need to get back to the Inmoov ROS2 driver (sim) I updated and then work on a hardware driver for it. I also want to take a look at Foxglove, which lets you visualise all of your ROS data.

Brilliant
Something I had not mentioned for a while is that I am still doing Brilliant.org daily and am up to over 400 days now. I do need to actually focus on expanding my learning here instead of going for the easy topics.

Reviewing 2025
At the start of 2025, I set 3 high-level goals:

  • MyRobotLab + The Inmoov + LLM
  • ROS2 and the Rover Robot
  • Gwiz and predictive driving

The year didn’t go as planned due to my redundancy, but I ended up doing more robotics than in any previous year. This led to the formation of Burf Robotics, and I successfully met my MyRobotLab and ROS2 goals. The Gwiz was always intended as a fun side project, so I’m not concerned about that. My conservatory has now become a dedicated robotics space, and overall, I feel I’ve gained a significant amount of knowledge over the past year.

Plan for 2026
I haven’t fully stepped back to define the plan in its entirety yet; however, I do have several clear areas of focus for 2026:

  • Strengthen Burf Robotics and its offerings
    Target: Gain at least one additional customer.
  • Launch Burf Platform as an MVP
    Target: Deliver an MVP with user management, robot management, and support for a range of robot platforms.
  • Continue learning ROS2
    Target: Deepen my understanding of Nav2, MoveIt, and integrating AI with ROS2.
  • Inmoov Robots
    Target: Focus more on interactive demonstrations rather than building additional robots. I would still like to develop a mobile InMoov platform.
  • Continue Brilliant learning
  • Build my own robot product
    I am not fully committed to this yet; however, I did begin building my own robotic arm. Developing my own robot could be a valuable learning experience, particularly by designing and implementing the ROS2 drivers around custom hardware. This would focus on a robotic system with arms rather than a mobile platform.

In my next update, I hope to solidify the plan.

September 2025: Head Down Edition

Hello and welcome to another Burf Update. I struggled for a bit to pick a title for this edition due to the range of robotic things I have done over the last month. I have certainly spent a lot of time doing robotics.

Gwiz Update
I have only spent a few hours on the Gwiz, trying to work out which batteries are still iffy. I have also installed a battery meter and a 10″ Android headunit, which I hope to be able to deploy my own apps to. This should allow me to record driving data, etc.

Afobot
Back when I bought my 17 Sanbot Elf robots, I was also given 40 (ish) Afobot robots. These are little desktop-based things with a screen that can move side to side and up and down. They run Android 7, and initial investigation suggested they had no SDK and no real use. However, with the help of ChatGPT, I managed to write some code to control the motors. The next plan is to connect it to my websockets project and support teleop.

ROS2, the Rover Robot and the Robotic Arms
So I have continued to learn ROS2, overcoming some headaches along the way. I still need to learn more; however, the knowledge is slowly sinking in. One awesome thing I did get going was simulating the Inmoov robot in Rviz2 and Gazebo. This was super cool but nearly broke my brain. A lot more work needs to be done, but it's a positive start.

I also built a setup so that I could place a LEGO brick and my robotic arm (myCobot 280) would pick it up.

Inmoov Update
As for the Inmoov humanoid robots, I am preparing 2 of them for a show this month. While preparing Robob, its arm snapped off, which I have now rebuilt. Robob and Reggie both run off a Raspberry Pi 5 now, and I am working on making them more portable. I have not done any more work on the 3rd robot, but do plan to.

Burf.co
So, I have turned off the Burf.co Search Engine for the moment to help me focus on the future, which at the moment is robotics. I want to get my own teleop robotic service going, and so I need to focus. The site has also moved to faster hosting (Thanks, Adrian).

Reviewing the Plan
I am super keen to focus on the plan I set out last month, so let's see if I have worked towards it or, as usual, got sidetracked.

Create a Sanbot Remote Control App

  • Design a controller that streams all sensor data from the Sanbot to a user interface, allowing remote movement and activation of functions.

So I have created a prototype Android app and Python script that stream sensor data, including the camera feed, from a Sanbot to a server. I can also move the robot and make it speak. Next, I want to be able to support multiple robots and control head/arm movement.
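As a rough illustration of the streaming side, here is a hedged sketch of what the wire format between the app and the server could look like. The message types, field names, and robot id are hypothetical, not the prototype's real protocol.

```python
import json
import time

# Hypothetical wire format for the Sanbot prototype: message types and
# field names are my own illustration, not the actual app's protocol.

def telemetry_message(robot_id: str, battery: int, frame_b64: str) -> str:
    """Package one sensor snapshot (battery level plus a base64-encoded
    camera frame) as the JSON string the Android app would push upstream."""
    return json.dumps({
        "type": "telemetry",
        "robot": robot_id,
        "battery": battery,
        "camera": frame_b64,
        "ts": time.time(),
    })

def command_message(robot_id: str, action: str, **params) -> str:
    """Package a server-to-robot command, e.g. a move or speak request."""
    return json.dumps({
        "type": "command",
        "robot": robot_id,
        "action": action,
        "params": params,
    })

# e.g. asking a (hypothetical) "sanbot-01" to talk:
cmd = command_message("sanbot-01", "speak", text="Hello!")
```

Tagging every message with a robot id like this is what would make the multi-robot support mentioned above straightforward: the server just routes by `robot`.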

Continue to deepen my ROS2 knowledge

  • Fix bugs around MyCobot280/xArm hardware, so that API works well
  • Finish self-driving course on Udemy
  • Understand SLAM and mapping better so that a robot can travel between rooms
  • Add IMU support

I have continued to increase my ROS2 knowledge and am progressing with the MyCobot arm and the self-driving course. For work, I have written a cool script that can control the robotic arm to pick up a LEGO brick.

Continue Developing the InMoov Humanoids

  • Have an active one for improving Vision and LLM use. Think about using a Neural Net to give personality. Think about gestures.
  • Create a mobile Inmoov robot, even if it's remote
  • Consider merging this with the Sanbot remote control app

At the moment, I am focusing on getting the Inmoov robots ready for a show, however in doing so I am improving them. With the ROS2 Inmoov work happening, this only improves my likelihood of doing more ROS2 and Inmoov work.

Have Fun with the G-Wiz

  • Design a remote-control system (concept-only for now)
  • Add sensors and record journey data
  • Use that data in a self-driving AI test
  • Explore ROS integration for car control

Some small steps made by adding the Android headunit, I have lots of ideas here but mainly focusing on the fun side.

It generally looks like I have stayed on track, which is good news. I hope the October update shows equal or even more progress.

August 2025: The ROS2 Edition


Hello and welcome to another Burf update! Sadly, I didn’t get a chance to publish a July post — it’s been full throttle on the robotics front. This year is really shaping up to be a robotics-heavy one, and I’m loving it.

MTC: Robotics and Automation Event
We attended the event, and the second InMoov robot did well playing Rock, Paper, Scissors. A few lessons were learned — notably, face detection struggled with different skin tones and people wearing glasses. I had (wrongly) assumed that the maturity of OpenCV would have eliminated these kinds of biases by now.

As for the Sanbot robot — it didn’t get used much, so I need to think harder about a compelling use case for it.

Gwiz Update
Back in May, the G-Wiz passed its MOT, but unfortunately, the batteries were shot. Luckily, a kind soul from the G-Wiz forum donated some replacements (seriously nice of them), and now the car is up and running. The plan — once I clear the next few tasks — is to start recording data and begin training an AI to drive it.

ROS2, the Rover Robot and the Robotic Arms
This month has been packed with ROS2 learning. I’ve completed the ROS2 Manipulator course on Udemy (and I’m nearly done with the self-driving course). I’ve successfully controlled three different robotic arms using MoveIt, Gazebo, and RViz.

Massive thanks to Compsoft for giving me time to dive deep into ROS2 — they had a cool idea involving it, which gave me the perfect excuse to learn. I still have plenty to master, but I’m feeling far more confident than I did a few months ago.

AI
ChatGPT has been incredibly helpful throughout the ROS2 journey. Sure, it’s made its share of mistakes — looping, incorrect suggestions, etc. — but it’s been a great learning tool. The bigger vision is to use AI to simplify robotic arm control, lowering the barrier to entry for others.

Burf.co
The main website has been updated to reflect my focus on robotics. The new goal is to offer consultancy and robot rentals — an exciting next step in the Burf journey!

Inmoov humanoid 3
Yes, somehow I’ve ended up with a third InMoov robot — rescued just before it was scrapped. It still needs a lot of work, but I’ve managed to piece it back together and source enough parts to upgrade the others. This one now has LCD eyes, which adds a fun new dimension.

The Plan
At the start of this year, I set myself three high-level goals:

  • MyRobotLab + The Inmoov + LLM
    (Rock, Paper, Scissors demo)
  • ROS2 and the Rover Robot
    (ROS2-powered rover and MoveIt-controlled robotic arms)
  • Gwiz and predictive driving
    (Train an AI to drive the G-Wiz using collected data)

So far, I’ve successfully achieved the first two. The third — predictive driving with the G-Wiz — hasn’t really taken off yet, though it’s still on the roadmap.

In the meantime, I’ve also acquired a whole fleet of Sanbot Elf robots and rebranded Burf.co as a robotics-focused website.

Thanks to changes at work, I’m now spending a lot more time on robotics. That’s not only helped me hit my previous goals faster, but it’s also inspired and refined the next set of objectives I’m working towards.

New Goals
These goals aren’t in priority order and will naturally evolve over time, but they represent the next phase in my robotics journey:

  • Create a Sanbot Remote Control App
    Design a controller that streams all sensor data from the Sanbot to a user interface, allowing remote movement and activation of functions.
    — Critical for the robotics rental business idea.
  • Continue to deepen my ROS2 knowledge — this remains vital for both work and personal robotics development.
    — Ongoing learning improves real-world application and problem-solving.
    • Fix bugs around MyCobot280/xArm hardware, so that API works well
    • Finish self-driving course on Udemy
    • Understand SLAM and mapping better so that a robot can travel between rooms
    • Add IMU support
  • Continue Developing the InMoov Humanoids
    These are great for showcasing my skills and drawing crowds at events.
    • Have an active one for improving Vision and LLM use. Think about using a Neural Net to give personality. Think about gestures.
    • Create a mobile Inmoov robot, even if it's remote
    • Consider merging this with the Sanbot remote control app
  • Have Fun with the G-Wiz
    This one’s for fun, learning, and experimentation.
    • Design a remote-control system (concept-only for now)
    • Add sensors and record journey data
    • Use that data in a self-driving AI test
    • Explore ROS integration for car control

These are some pretty epic goals, and definitely enough to keep me both challenged and inspired. Each one strengthens my experience in robotics across multiple fronts — hardware, AI, mobility, UX, and control systems.

I’ll need to keep an eye on context switching to ensure I stay focused enough on one thing at a time to make meaningful progress.

February 2025: Igniting the fire!

Hello and welcome to another Burf Update. In this update, we have an exciting title! So what have I been doing?

Burf.co Blog
So I have spent some time updating the content of the blog in the hope that it will focus me on the future, which for me is robotics. I hope to re-ignite my Facebook page and spread my news a little further (when there is something useful to say). I have added an AI page and broken my robotics projects down by technology (LEGO, VEX, Inmoov, and ROS).

It does help me realise that I have still achieved some cool things; not as much as I hoped, but progress is continuing. I still have some content to add, so this is a WIP.

MyRobotLab + The Inmoov + LLM = InMoov Remote
So one of my goals was to understand MyRobotLab more and be able to write my own external Python scripts to control the Inmoov Humanoid Robot and the LLM (Ollama) that it uses. It's still a work in progress, but I have made a working demo, which can be found here:

Features so far:

  • Consume webcam feed and apply YOLO11 and/or Hand/Face detection
  • Stop and start the robot listening
  • Can get the robot to speak, or respond to a text input (e.g. answer a question)
  • Lists all responses back from MRL
  • Can ask the LLM for a response
  • Maps all of the services to JSON so that you can get things like a list of gestures.

Next, I want to put my own LLM layer in so that I can bypass the MyRobotLab chatbot. This will allow me to add a database connection so I can record responses for review, plus add a memory of sorts.
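A minimal sketch of what that layer could look like, assuming a SQLite log of every exchange; the schema and the stand-in ask() function are my own illustration, not MyRobotLab's or Ollama's API.

```python
import sqlite3

# Sketch of the planned LLM layer: an SQLite table records every
# prompt/response pair so they can be reviewed later and fed back in
# as a crude memory. Schema and ask() are illustrative placeholders.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE chat (prompt TEXT, response TEXT)")

def ask(prompt: str) -> str:
    """Placeholder for the real LLM call (e.g. a request to Ollama)."""
    return f"echo: {prompt}"

def chat(prompt: str) -> str:
    """Query the LLM, log the exchange to the database, return the reply."""
    reply = ask(prompt)
    db.execute("INSERT INTO chat (prompt, response) VALUES (?, ?)",
               (prompt, reply))
    return reply

def history(limit: int = 5) -> list:
    """Most recent exchanges: the 'memory of sorts' that could be
    prepended to new prompts for context."""
    return db.execute("SELECT prompt, response FROM chat"
                      " ORDER BY rowid DESC LIMIT ?", (limit,)).fetchall()
```

Swapping `ask()` for a real Ollama HTTP call is the only part that would change; the logging and memory layer stays the same.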

ROS2 and the Rover Robot
Good news on that front also: I have continued my ROS2 course, and I have fixed the major issues with the Rover Robot when trying to get it to use NAV2 and SLAM. It all came down to the costmap's inflation_radius: basically, the detected obstacles had their areas inflated so much that the planner could not plot a route around them.
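A quick back-of-envelope check of that failure mode (the numbers are illustrative, not from the Rover's actual config): each wall of a corridor gets padded by inflation_radius, so an oversized radius can close a gap the robot would physically fit through.

```python
# Illustrative arithmetic for costmap over-inflation. Both walls of a
# corridor are padded by inflation_radius, so the free gap the planner
# sees shrinks by twice that amount.

def corridor_passable(corridor_width: float, robot_radius: float,
                      inflation_radius: float) -> bool:
    """True if the planner still sees room for the robot's footprint
    after both corridor walls are inflated."""
    free_gap = corridor_width - 2 * inflation_radius
    return free_gap > 2 * robot_radius

# A 0.9 m doorway with a 0.2 m robot radius:
corridor_passable(0.9, 0.2, 0.55)  # over-inflated: no route found
corridor_passable(0.9, 0.2, 0.20)  # smaller radius: route exists
```

This is why shrinking inflation_radius (or tuning the inflation layer's cost scaling) in the NAV2 costmap config fixed the route planning.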

The aim is to complete the ROS2 course (another one) I am doing on Udemy, refine the Rover Robot so it works a lot better using SLAM and NAV2, and then build a much better version that's at least half human size.

Gwiz and predictive driving
Progress with the Gwiz has sadly stalled at the moment. First it was the cold weather killing the batteries, and now someone has smashed the windscreen. Autoglass has tried to replace it; however, it has turned into a bit of a custom job, as there are not many Gwizs left on the road.

AI
All of the above relies heavily on AI in some form or another. I now use ChatGPT and other LLMs on a daily basis to help me achieve tasks. At work, I am now training models across datasets to detect trends, anomalies, and insights. It is quite extraordinary how fast it has entered a large part of my life. There is no goal attached to this heading, just a reflection.

Conclusion
Not sure if conclusion is the right word, but focusing on the 3 areas above is at least a good start on becoming more focused. Each one could take me into an exciting area; it's just a matter of coming up with the use case and sharing the knowledge.

September 2024: The Reboot Edition

Hello and welcome to another Burf Update. I feel this is the re-start of some fun times ahead :). As with most people, August was mainly written off due to family holidays (and canceled ones); however, now that's over, it's time to get my head down and focus!

The Elephant has left the room
If you read my last blog post, I wasn't in the best place; however, I honestly feel like I am turning a corner and feel pretty positive about the future 🙂

Burf.co
Since I put Burf.co back to its old state (11th July), it has had 1,186,315 searches, which is quite bonkers! Now, a lot of this will be bots etc., but it's still quite an interesting result.

Inmoov Robot
So, I have not done much on fixing my Inmoov robot, but I randomly found another in Wales and have started fixing that; this one will be mounted to a mobile platform :). I feel super lucky to have found one. I have also dug out the electric wheelchair to pair with the new robot, or at least to get the platform working. If you look at the picture, you can see the fresh white printed parts, which are the bits I have fixed so far.

Camper Van
The camper van had a charging issue, which I have managed to fix. The main goal here is to use it regularly :). I also fitted a split charger to the van; I feel like this has some good possibilities.

Sinclair C5
My Sinclair C5 had a meltdown at the same time the Harley blew up (expensive); however, I have managed to re-wire that too, and it now works and should be a lot more reliable 🙂

Electric Motorbike
Randomly, I dug out the electric motorbike this weekend, did some welding, and printed a new 3D mount for it. I hope to test it this week.

Maths / Brilliant App
I am planning to come back to this soon. I have just started a Maths diploma I bought last year. I am hoping to get more focus.

Rover Robot / ROS2
I have failed on this because of the personal life issues that have gone on with the ex (a long story) and the sadly canceled holiday to Vegas (dream holiday); however, there have been some silver linings that have put me in a good headspace. I am thankful for this and hopeful for the future.

A5 Laser Engraver
When I went to get the Inmoov robot from Wales, the epic guy also sold me an A5 ATOMSTACK laser engraver, with software that has been super easy to use. I have managed to cut shapes from wood with it.

Let’s revisit the goals

Life Goals
I originally wanted to get to 85-90KG, and I am at 80KG and in good shape. I am proud of my progress here; I would say my health goals are pretty much achieved. I would like to continue running, which was going well, and keep doing the gym with Alfie. At some point, I am going to focus on doing a muscle-up. I also carried on being a school governor (barely), and I now have a camper van, which replaces the truck. The loft has lights, and I have a large shed.

Education Goals
I finished my MBA and still try to keep on top of new content and the Slack channel. I still try to join a few sessions a month and have been asked to do a session on robotics; I would like to make sure I do this video. I did complete the self-driving course, and I have done more Maths this year than in most years. I would like to do some more Maths this year; however, again, a pretty good result.

Projects
So the main one was to fix the Inmoov robot, which I did do. I also learned the software (MRL) and feel quite happy with the progress I have made. I would like to be more active on the Discord channel and Facebook group, but again, I have achieved a lot. I had parked the electric motorbike project when I hit a big issue; however, I am going to give it a quick punt again. I haven't done a lot with Burf.co, but I am not that worried. I did get the mobility scooter and electric wheelchair working via remote control.

Goals for the end of the year
I feel like I have done pretty well so far and have gained a few more skills (welding, 3D modeling, etc.); however, I would like to bullet-point some top-level items.

  • Stay around 80-82KG (Life)
  • Camp at least 2 more times this year in the VW (Life)
  • Finish the YouTube robotics series of videos (Robotics / Education)
  • Stay on top of being a governor (Life)
  • Do robotic video for MBA course (Robotics)
  • Complete my course on mental health (Education)
  • Re-start the maths learning (Education)

I think that's pretty easy to achieve. I could keep adding more, but I feel that would not help me focus on what I want to do.

April 2024: The Brief Edition.

Hello and welcome to another Burf update. I haven't actually thought of a title for this, as until I stop and think, I am not really sure what I did in April.

The Elephant in the room

So the last time I wrote a post, I was super excited about a show (which went super well), I had done tons of work on the Inmoov robot, and Compsoft had received some Apple Vision Pro’s. Exciting times!!!

Sadly, a few personal things were also going on, and still are. I have sadly split up with my partner, which has made the last month a real struggle. I know, everyone has been there; life is about ups and downs, and at the time I thought I was doing the right thing for myself; however, I am not quite sure now. Life has been pretty tough recently; however, I am lucky that I have amazing friends and family looking out for me.

Robotic and Automation Show

We smashed it; we could not have asked for a better show. Our stand was super popular, and everyone loved our tech. We were told many times that we had the most engaging stand. We got to chat with some outstanding people and made some new contacts. We are now set to do another show in May 🙂

Inmoov Robot

Since the show, I haven't done a great deal on the robot. I have replaced its neck, as it was not working very well. I have printed a new arm/hand but have not had time to build it, and I have sped the robot up for the Apple Vision Pro demo. Oh, and I fixed the other arm. Once I get my mojo back, I hope to continue improving it.

Rover Robot / ROS2

I have started designing and building my own robot. I have only made the briefest progress; however, this is something I want to achieve this year, ideally before July. I am following some rough guides so that I have some sort of direction. It is very similar to:
https://articulatedrobotics.xyz/tutorials/mobile-robot/project-overview/

It will run ROS2, SLAM and will have Lidar and a vision system. I hope to drive it outside remotely. This will help me learn more about robotics and ROS2.

ROS2 mentorship

As I mentioned last month, I got invited onto a ROS2/Robotics mentorship program; however, I have just not been in the right place mentally to keep up with it. Sad, really, but I have to be honest with myself. The classes happen at very odd times, so it's quite easy to miss them.

Brilliant App

A long-running theme of mine is improving my Maths, as I need it for robotics. I have tried a few things out there; however, the one that seems to keep my brain moving (so far) is the Brilliant app. It's really good but a little expensive. I am sure I will cave and buy it, as it seems to keep me engaged.

Shed

Due to my partner and I splitting up, I had to get my stuff out of her garage. This was a lot of LEGO, which is now in my new 6×9 shed. Thanks to my dad for buying it for me for my birthday 🙂

Unity

At work, I have been looking into Unity, spatial computing with the Apple Vision Pro, and digital twins. The plan is to try to use Unity to move a 3D model of the blue robotic arm in mixed reality while the real arm mimics it. I have made some progress, but I haven't used Unity for years. Unity seems to have made the decision to create a new framework for the Apple Vision Pro (PolySpatial) instead of adding the device to the Mixed Reality template, which supports Quest, HoloLens, and Magic Leap. You also need a Pro licence to use it (£1,800 per year).

Goals

I know I have a list of goals defined and am on track to smash them; however, my focus at the moment is getting my brain firing up and getting my mojo back to how it was at the peak of the O2 Labs / LEGO times. Those were fun times, and I really was doing some epic things.

March 2024: Robot Edition

Hello and welcome to another Burf Update. This one focuses on ROBOTS and the Apple Vision Pro. I know, you're probably thinking: hang on, they all feature robots? What's different? Well, at work we received 2 Apple Vision Pros to play with. These $3,500 devices are not in the UK yet, but the boss managed to get 2. These super cool devices allow you to do spatial computing (I had a go, and it really is a game changer).

So when they arrived, my company (Compsoft Creative) wanted to do something cool with them, so after a brief chat, we decided we would use them to control my Inmoov Robot.

So after an intense 3-day workshop getting the head/hand/finger data from the Apple Vision Pro to control the robot (good work, Dave), we decided to take it to the next level!

So we are going to take the Inmoov robot to the Robotic and Automation Show at the NEC next week.

The Rush

As you can see in the videos above, my Inmoov robot is a bit rough; it's had a very hard life. So I decided it was time to make it look better before the show (which is in about 2 weeks). Work was kind enough to give me time to do it, as they needed the robot for the show and there was a lot of work to do.

So what did I manage to do:

  • New head with a new camera and built-in speaker
  • Completely rewired the robot, and all connecting blocks removed
  • New chest plates, sensors, etc
  • New back plates
  • Fingertips added
  • Ears added (not sure about them)
  • A new neck which is a lot stronger
  • Printed a case for the power supply
  • Set up a Compsoft profile in MyRobotLab (took hours to get this to work well)

 

One off the list

So one of my goals for the year was to finish the Inmoov robot, and I can say pretty confidently that I have achieved this. If the robot stays in one piece for the show, I can also say it counts as reliable.

Anycubic Kobra 2 Max 3D printer

Bloody game-changer, that's all I can say! So my work bought a 3D printer to help out with the printing needed to update the Inmoov robot, because 3D printing is slow as hell; it takes hours or days to do anything large.

Not any more: this printer is at least 5-6 times faster than my old one, so a 16-hour print takes about 3 hours (at standard speed; there is also a sports mode!). Honestly, it's the most epic, exciting toy I have played with for ages. This opens up a lot of crazy ideas.

Goals for the year

It’s always good to look back at the goals, let’s see if we can cross some off

This year’s Targets

  • Finish printing the 3D parts, make improvements to the body
  • Rebuild / Fix the head as the eyes do not function very well
  • Learn MyRobotLab (MRL)
  • MRL: Be able to say some words, and it can perform a custom gesture
  • MRL: Be able to track or detect people using OpenCV
    Plan to do a bit more on this
  • MRL: Be able to call a custom API
    Plan to do a bit more on this
  • MRL: Learn more about AIML (there is a Udemy course)
    Started watching the AIML course
  • Fire up the KinectV2 sensor that’s in the robot
    Not done yet; it turns out Inmoov does not support this.
  • Improve the wiring in the Inmoov robot as it is a mess

So over half the list has been done, and one item is no longer possible (Kinect), so I may replace that with something else. That is a great start to the year.

ROS2 mentorship

So I saw an offer on LinkedIn for free mentorship on robotics and ROS. Of course, me being me, I applied and got accepted into the group. It has weekly sessions and lots of useful resources; however, because I have been super busy with the Inmoov robot, I am massively behind. I still see this as a great way to kickstart me into ROS2, even if I can't keep up.

Vehicles Update

So Jeff the Narava has been sold, as I barely used it and it seemed a waste of money. The Harley is also up for sale for the same reason. My dad gave me his 250cc scooter (well, swapped it for some LEGO), and my Gwiz is on the road and I am LOVING IT. The Gwiz AC version is so much fun; it's now my main vehicle.

Feb 2024: Fog Edition

Hello and welcome to another Burf Update. Warning this one could be total nonsense!

So, in the last update, I defined the plan below. The aim of this plan was to stay focused and achieve a level of depth in a subject (e.g. MyRobotLab). A deep dive, as they say! Now the issue is that the software side of the Inmoov robot is really good and comes with a lot of epic features out of the box, so my aim to deep dive into it isn't really required.

This year’s Targets

  • Finish printing the 3D parts, make improvements to the body
    I am currently rebuilding the chest with sensors etc
  • Rebuild / Fix the head as the eyes do not function very well
    I have printed out the parts for this, just not put them together 
  • Learn MyRobotLab (MRL)
    I have spent a few hours going over the software
  • MRL: Be able to say some words and have it perform a custom gesture
    See below
  • MRL: Be able to track or detect people using OpenCV
    MRL has many OpenCV libraries already implemented including face detection, face tracking and object detection (YOLO). I do need to spend some time playing with this.
  • MRL: Be able to call a custom API
    See below
  • MRL: Learn more about AIML (there is a Udemy course)
    Started watching the AIML course
  • Fire up the KinectV2 sensor that’s in the robot
    Not done yet; it turns out Inmoov does not support this.
  • Improve the wiring in the Inmoov robot as it is a mess
    Making progress already

Custom commands to perform a custom action / call a custom API

In theory, I believe I know how to do this. There is an AIML file called oob.aiml in the ProgramAB/en-us/aiml/ directory. OOB stands for Out of Band, and I believe this is where you would put your custom commands. When you define a custom command, you can also define a Python function for it to call.

Example:
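A minimal sketch of what such an oob.aiml category could look like. Note the pattern wording, the batteryLevel() function name, and the exact payload shape are my own placeholders, not the shipped file:

```xml
<!-- Hypothetical oob.aiml category: the <oob> block is executed
     out of band (by MRL) rather than spoken back to the user. -->
<category>
  <pattern>BATTERY LEVEL</pattern>
  <template>
    Checking the battery.
    <oob>
      <mrl>
        <service>python</service>
        <method>exec</method>
        <param>batteryLevel()</param>
      </mrl>
    </oob>
  </template>
</category>
```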
So, if I say “Battery Level” to the robot, it will call this Python function.
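As a sketch of such a function (the name and the voltage-to-percentage mapping are my own assumptions; in MRL it would live in a script loaded by the Python service):

```python
# Hypothetical "Battery Level" handler. The thresholds below assume a
# 3S LiPo-style pack; adjust for your actual supply.

def battery_level(voltage, full=12.6, empty=10.5):
    """Map a measured pack voltage to a spoken percentage string."""
    pct = (voltage - empty) / (full - empty) * 100
    pct = max(0.0, min(100.0, pct))  # clamp to 0-100
    return "Battery at %d percent" % round(pct)

# In MRL you might then have the robot speak it, e.g.:
# i01.mouth.speak(battery_level(read_pack_voltage()))
```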
Once we are in Python, we can kind of do whatever we want.  Call APIs, move servos, etc.  There are quite a few examples included in the software.

Now there are a lot of clever things you can do in AIML: store values (a name, for example), match on wildcard commands, give out random responses from a list, etc. Human.predicates.txt seems to be where information about the session (name, age, etc.) is stored.
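For instance, a small AIML sketch combining a wildcard, a stored predicate, and a random reply might look like this (the wording is my own, not from the shipped files):

```xml
<!-- Hypothetical category: wildcard match, predicate storage, random response -->
<category>
  <pattern>MY NAME IS *</pattern>
  <template>
    <think><set name="name"><star/></set></think>
    <random>
      <li>Nice to meet you, <get name="name"/>.</li>
      <li>Hello, <get name="name"/>!</li>
    </random>
  </template>
</category>
```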

In Python, you can control each servo from code:

   i01.moveArm("left",43,88,22,10)
   i01.moveArm("right",20,90,30,10)
   i01.moveHand("left",0,0,0,0,0,119)
   i01.moveHand("right",0,0,0,0,0,119)

Details on the above can be found here

However, you can also build custom gestures using the built-in ServoMixer.

So using the above, I could easily write a voice command that picks up the word “Burf” and calls a Python function, waveAtBurf, which controls the arm servos and says “Hello”.
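A hedged sketch of that gesture, with the service passed in as a parameter so it is easy to test; the servo positions echo the moveArm/moveHand lines above, and waveAtBurf is my own naming, not something shipped with MyRobotLab:

```python
from time import sleep

# Hypothetical gesture: wave the left arm and say hello.
# "robot" would be the i01 InMoov service inside MRL's Python service.
def wave_at_burf(robot):
    robot.moveArm("left", 43, 88, 22, 10)       # raise the left arm
    robot.moveHand("left", 0, 0, 0, 0, 0, 119)  # open the hand
    sleep(0.5)                                  # let the servos travel
    robot.moveHand("left", 130, 180, 180, 180, 180, 119)  # close into a wave
    robot.mouth.speak("Hello Burf")

# Inside MyRobotLab you would call: wave_at_burf(i01)
```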

It has an API

MRL also has an easy-to-use API where you can find out the methods of a service using the following GET request:

http://localhost:8888/api/service/{name}/

For example, making the robot talk is as simple as http://localhost:8888/api/service/i01.chatBot/getResponse/"what time is it"

Then you can call any of the methods using POST or GET. More info here: http://myrobotlab.org/content/myrobotlab-api

So, what this potentially allows me to do is remote-control the robot from any other system. Technically, you could write a framework/layer/app on top of this with your own functionality, while still being able to control the robot.
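As a sketch of that idea, here is a tiny helper that builds those REST URLs from another system. It assumes the default localhost:8888 base shown above; the helper name is mine, not part of MRL:

```python
from urllib.parse import quote

# MRL's REST base as used in the examples above (default web port 8888).
MRL_BASE = "http://localhost:8888/api/service"

def mrl_url(service, method, *args):
    """Build a GET URL for a MyRobotLab service method call."""
    parts = [MRL_BASE, service, method] + [quote(str(a)) for a in args]
    return "/".join(parts)

# e.g. asking the chatbot for a response:
# urllib.request.urlopen(mrl_url("i01.chatBot", "getResponse", "what time is it"))
```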

The Fog

So I am now scratching my head thinking: what should I do next? The year plan involved:

  • Finishing the Inmoov robot/wiring (in progress)
  • Learning MRL (feeling pretty good about that already)
  • Fire up the KinectV2 sensor that’s in the robot (not supported it turns out)
  • ADDED: Have confidence that the robot is reliable (e.g. can run cycle gestures without breaking)

I did mention learning ROS2 and making my own robot; however, these were planned for later in the year.

Current thoughts

So by adding the reliability target above, and by bringing forward the ROS2/DIY robot, I think I can get the deep dive that I am looking for.