January 2026: Happy New Year Edition

Hello and welcome to another Burf Update. I hope everyone had an amazing Christmas and an epic New Year. The Christmas period rushed by at lightning speed but was enjoyable. In this edition, I hope to map out what I plan to do in 2026.

Burf Robotics
Burf Robotics is the key focus for 2026: I want to make sure it is operating properly as a company now that it has its first customer (OLO Robotics), which moves it from an idea to a reality. I am also working on its first product, the Burf Platform.

Burf Platform
Over the festive period, I spent much of my spare time working on the platform. The current vision remains a remote teleoperation platform that supports multiple types of robots. So far, I have developed drivers for:

  • Sanbot Elf
  • Afobot
  • ROS2 Mobile robots
  • Double 2 Robot

The ROS2 driver is the one I am most excited about, as it has the potential to open access to a wide range of robots. My aim is to make integrating the driver as simple as possible, mapping the camera feed (via WebRTC), sensors, and teleoperation topics directly into the platform. The next step is to set up hosting for the platform.
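To make the teleoperation mapping concrete, here is a minimal sketch of the kind of translation the driver has to do: turning a platform teleop command into a ROS2 Twist-style velocity message. The field names ("drive", "turn") and the safety caps are illustrative assumptions, not the actual Burf Platform protocol; in the real driver the resulting values would populate a geometry_msgs/Twist and be published on the robot's /cmd_vel topic via rclpy.

```python
import json

MAX_LINEAR = 0.5   # m/s cap for safety (assumed value)
MAX_ANGULAR = 1.0  # rad/s cap (assumed value)

def teleop_to_twist(payload: str) -> dict:
    """Convert a JSON teleop command into a Twist-like dict.

    Clamps normalised joystick values (-1..1) before scaling, so a
    buggy or malicious client cannot command full-speed motion.
    """
    cmd = json.loads(payload)
    linear = max(-1.0, min(1.0, cmd.get("drive", 0.0))) * MAX_LINEAR
    angular = max(-1.0, min(1.0, cmd.get("turn", 0.0))) * MAX_ANGULAR
    return {"linear": {"x": linear, "y": 0.0, "z": 0.0},
            "angular": {"x": 0.0, "y": 0.0, "z": angular}}
```

Keeping this mapping in one small, robot-agnostic function is what should make adding each new robot type cheap: only the publish side changes per platform.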

ANYCUBIC Photon Mono 4
So on a different note, I got a resin printer for Christmas, which was awesome. I am super impressed with the printing quality; however, it is a lot messier and smellier than normal 3D printing. I hope it will expand what I am able to print for my projects.

ROS2
I have continued my ROS2 learning, and I am really enjoying it. I still have an awful lot to learn; it feels like the more I understand, the more I realise is still ahead of me. However, for my needs, I am making good progress. I need to get back to the InMoov ROS2 (sim) driver I updated and then work on a hardware driver for it. I also want to take a look at Foxglove, which lets you visualise all of your ROS data.

Brilliant
Something I had not mentioned for a while is that I am still doing Brilliant.org daily and am up to over 400 days now. I do need to actually focus on expanding my learning here instead of going for the easy topics.

Reviewing 2025
At the start of 2025, I set 3 high-level goals:

  • MyRobotLab + The Inmoov + LLM
  • ROS2 and the Rover Robot
  • Gwiz and predictive driving

The year didn’t go as planned due to my redundancy, but I ended up doing more robotics than in any previous year. This led to the formation of Burf Robotics, and I successfully met my MyRobotLab and ROS2 goals. The Gwiz was always intended as a fun side project, so I’m not concerned about that. My conservatory has now become a dedicated robotics space, and overall, I feel I’ve gained a significant amount of knowledge over the past year.

Plan for 2026
I haven’t fully stepped back to define the plan in its entirety yet; however, I do have several clear areas of focus for 2026:

  • Strengthen Burf Robotics and its offerings
    Target: Gain at least one additional customer.
  • Launch Burf Platform as an MVP
    Target: Deliver an MVP with user management, robot management, and support for a range of robot platforms.
  • Continue learning ROS2
    Target: Deepen my understanding of Nav2, MoveIt, and integrating AI with ROS2.
  • Inmoov Robots
    Target: Focus more on interactive demonstrations rather than building additional robots. I would still like to develop a mobile InMoov platform.
  • Continue Brilliant learning
  • Build my own robot product
    I am not fully committed to this yet; however, I did begin building my own robotic arm. Developing my own robot could be a valuable learning experience, particularly by designing and implementing the ROS2 drivers around custom hardware. This would focus on a robotic system with arms rather than a mobile platform.

In my next update, I hope to solidify the plan.

December 2025: Christmas Edition

Hello and welcome to another Burf Update. I hope everyone is enjoying the run-up to Christmas and that special time off with loved ones. Around this time of year, I start to work out what I have achieved.

OLO Robotics
Formerly BOW, I am chuffed to announce that I am contracting for them one day a week (outside my day job), testing their robotics platform. This is through Burf Robotics, which means it has its first customer. They have lent me a Leo Robot, a great bit of ROS2 kit, to put their platform through its paces. This proves that my ROS2 learning and setting up Burf Robotics were good ideas, and it also keeps me doing robotics regularly.


ROS2 Rover
I have also made some improvements to the ROS2 3D-printed rover I built. I have added an IMU (an MPU6050 instead of an MPU9250, due to counterfeit parts); however, this introduces me to sensor fusion, as you need to fuse the odometry (odom) data with the IMU data. I need to spend more time on this, but it should improve both the rover and my knowledge of ROS2.
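In ROS2 this fusion is normally handled by the robot_localization package's EKF node, but the underlying idea can be sketched with a toy complementary filter on heading. This is a simplification for illustration, not what robot_localization actually runs:

```python
import math

def fuse_heading(odom_yaw: float, gyro_z: float, fused_yaw: float,
                 dt: float, alpha: float = 0.98) -> float:
    """One step of a complementary filter on heading (yaw).

    Integrate the gyro for short-term accuracy, then nudge the result
    toward the wheel-odometry yaw to cancel gyro drift. alpha close to
    1.0 trusts the gyro; (1 - alpha) trusts odometry.
    """
    predicted = fused_yaw + gyro_z * dt          # gyro integration
    fused = alpha * predicted + (1 - alpha) * odom_yaw
    # Keep the angle in [-pi, pi)
    return math.atan2(math.sin(fused), math.cos(fused))
```

The same trade-off (short-term gyro, long-term odometry) is what the EKF does properly, with covariances per sensor instead of a single hand-tuned alpha.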

Double Robotics 2
A bit of a random buy one night, but I now have an old Double Robotics 2 robot. This is a telepresence robot with an iPad on top. It is actually a nice bit of kit, and I hope to add it to my Burf Platform as my first iOS robot. Most of my other robots run Android.


Burf Platform
So I am working on an idea: a platform that allows remote control of old / hobbyist robots for people who can't program their own robots but still want to use them. It's still early stages, but I hope to have a prototype up soon, similar to the remote control app for the Sanbot robot. I want to use this as a project to learn a new technology; I am just deciding what that is.

Claude Code
So about a month ago I moved from Cursor (after only brief use) to Claude Code, and I am amazed at how powerful it is. It can do so much more compared to Cursor and ChatGPT. I would say the usage limits on it are very tight compared to Cursor; however, it is still a game changer.

Reviewing the Plan

So here was the latest plan from September before I lost my job:

  • Create a Sanbot Remote Control App
  • Continue to deepen my ROS2 knowledge
  • Continue Developing the InMoov Humanoids
  • Have Fun with the G-Wiz

So the G-Wiz has been paused for the moment: it needs new batteries and isn't taxed (it's no longer free to tax) or insured, so the upfront-effort-to-fun ratio is too high right now.

I have not done a great deal with the InMoov humanoids because (as I have just remembered) I started building my own simple arm, with a view to later building a whole robot, as I wanted something I could own myself. The InMoov designs are fantastic; however, they cannot be used for commercial gain. I think I may get back to this, but I also need to work on the 3rd InMoov robot.

I have continued to improve my ROS2 knowledge and the work with OLO Robotics can only help this. This also shows Burf Robotics has potential.

The Sanbot remote app, which will extend to the Afobot robot and later the Double Robotics 2 robot, will become the Burf Platform (name to change), which I have a lot of ideas for. After the OLO Robotics work, the Burf Platform will be my focus for a bit.

So on reflection, the last month has actually involved a lot more robotics than I expected after losing my job.

October 2025: All Change Edition

Hello and welcome to another Burf Update — this one’s a little different, and honestly, a bit sad. I can’t imagine there’ll be much robotics content in this one… but let’s see 🙂

Compsoft
Sadly, I (along with many others) have been made redundant from Compsoft. I’d just returned from my holiday when — boom — the news hit. It’s definitely a shock, though I understand the business has been struggling for some time.

I’ve spent around 18 years at Compsoft (with a few years away in between), and it’s been an amazing journey — great people, great projects, and so many memories.

Of course, this changes my future quite a bit. The job market feels chaotic compared to what I’m used to, but I’m incredibly thankful to say I’ve already been offered a new job, which I’ve accepted.

Burf Robotics
In the middle of all this, I wasn’t sure what the future would look like — but I’ve gone ahead and set up Burf Robotics Ltd!

It’s still early days, but I’m excited to see where it might go in the limited time I’ll have around my new role. Robotics has always been a passion, and this feels like a natural next step.

Milton Keynes Smart City and Robotics Competition
With all the Compsoft changes, the show we’d originally signed up for got re-themed as Burf Robotics — and we even made a brief appearance on the BBC, which was a nice surprise!

It’s been a rollercoaster, but I’m hopeful that new beginnings will come out of it all.

I also managed to get interviewed for a robotics podcast.

Reviewing the Plan
Outside of all this, I’ve mainly been focused on finding a new role and training up for it.

Right now, the plan is simple: do a great job in my new position — that’s my top priority. After that, I’ll review which robotics projects I can realistically take on with the limited time I’ll have.

It’s a bit sad that after achieving so much this year with robotics, things have ended this way… but I’m determined that it won’t be the end of the journey. Just a pause — and maybe the start of something new.

September 2025: Head Down Edition

Hello and welcome to another Burf Update. I struggled for a bit to pick a title for this edition due to the range of robotic things I have done over the last month. I have certainly spent a lot of time doing robotics.

Gwiz Update
I have only spent a few hours on the Gwiz, trying to work out which batteries are still iffy. I have also installed a battery meter and a 10″ Android head unit, which I hope to be able to deploy my own apps to. This will allow me to record driving data, etc.

Afobot
Back when I bought my 17 Sanbot Elf robots, I was also given 40 (ish) Afobot robots. These are little desktop-based devices with a screen that can move side to side and up and down. They run Android 7, and initial investigation suggested they had no SDK and no real use. However, with the help of ChatGPT, I managed to write some code to control the motors. The next plan is to connect it to my WebSockets project and support teleop.

ROS2, the Rover Robot and the Robotic Arms
So I have continued to learn ROS2, overcoming some headaches along the way. I still need to learn more; however, the knowledge is slowly sinking in. One awesome thing I did get going was simulating the InMoov robot in RViz2 and Gazebo. This was super cool but nearly broke my brain. A lot more work needs to be done, but it's a positive start.

I also built a setup so that I could place down a LEGO brick and my robotic arm (a myCobot 280) would pick it up.

Inmoov Update
As for the InMoov humanoid robots, I am preparing two of them for a show this month. While preparing Robob, its arm snapped off, which I have now rebuilt. Robob and Reggie both run off a Raspberry Pi 5 now, and I am working on making them more portable. I have not done any more work on the 3rd robot, but I do plan to.

Burf.co
So, I have turned off the Burf.co search engine for the moment to help me focus on the future, which right now is robotics. I want to get my own teleop robotics service going, so I need to focus. The site has also moved to faster hosting (thanks, Adrian).

Reviewing the Plan
I am super keen to focus on the plan I set out last month, so let's see if I have made progress towards it or, as usual, got sidetracked.

Create a Sanbot Remote Control App

  • Design a controller that streams all sensor data from the Sanbot to a user interface, allowing remote movement and activation of functions.

So I have created a prototype Android app and Python script that stream sensor data, including the camera feed, from a Sanbot to a server. I can also move the robot and make it speak. Next, I want to be able to support multiple robots and control head/arm movement.
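Supporting multiple robots mostly comes down to tagging every message with a robot ID so one server can multiplex the streams. A sketch of the kind of envelope I mean; the field names here are illustrative, not the app's actual wire format:

```python
import json
import time

def sensor_envelope(robot_id, sensor, value, ts=None):
    """Wrap one sensor reading in a routable JSON envelope.

    A server receiving these over a WebSocket can fan each message
    out to whichever UI is watching that robot_id.
    """
    return json.dumps({
        "robot_id": robot_id,   # which robot this reading came from
        "sensor": sensor,       # e.g. "battery" or "camera_meta"
        "value": value,
        "ts": ts if ts is not None else time.time(),
    })
```

Commands going the other way (move, speak) can use the same envelope shape, with the server routing on robot_id.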

Continue to deepen my ROS2 knowledge

  • Fix bugs around MyCobot280/xArm hardware, so that API works well
  • Finish self-driving course on Udemy
  • Understand SLAM and mapping better so that a robot can travel between rooms
  • Add IMU support

I have continued to increase my ROS2 knowledge and am progressing with the myCobot arm and the self-driving course. For work, I have written a cool script that can control the robotic arm to pick up a LEGO brick.

Continue Developing the InMoov Humanoids

  • Have an active one for improving Vision and LLM use. Think about using a Neural Net to give personality. Think about gestures.
  • Create a mobile Inmoov robot, even if it's remote
  • Consider merging this with the Sanbot remote control app

At the moment, I am focusing on getting the InMoov robots ready for a show; however, in doing so, I am improving them. With the ROS2 InMoov work happening, this only increases the likelihood of doing more ROS2 and InMoov work.

Have Fun with the G-Wiz

  • Design a remote-control system (concept-only for now)
  • Add sensors and record journey data
  • Use that data in a self-driving AI test
  • Explore ROS integration for car control

Some small steps were made by adding the Android head unit. I have lots of ideas here but am mainly focusing on the fun side.

It generally looks like I have stayed on track, which is good news. I hope the October update shows equal or even greater progress.

August 2025: The ROS2 Edition


Hello and welcome to another Burf update! Sadly, I didn’t get a chance to publish a July post — it’s been full throttle on the robotics front. This year is really shaping up to be a robotics-heavy one, and I’m loving it.

MTC: Robotics and Automation Event
We attended the event, and the second InMoov robot did well playing Rock, Paper, Scissors. A few lessons were learned — notably, face detection struggled with different skin tones and people wearing glasses. I had (wrongly) assumed that the maturity of OpenCV would have eliminated these kinds of biases by now.

As for the Sanbot robot — it didn’t get used much, so I need to think harder about a compelling use case for it.

Gwiz Update
Back in May, the G-Wiz passed its MOT, but unfortunately, the batteries were shot. Luckily, a kind soul from the G-Wiz forum donated some replacements (seriously nice of them), and now the car is up and running. The plan — once I clear the next few tasks — is to start recording data and begin training an AI to drive it.

ROS2, the Rover Robot and the Robotic Arms
This month has been packed with ROS2 learning. I’ve completed the ROS2 Manipulator course on Udemy (and I’m nearly done with the self-driving course). I’ve successfully controlled three different robotic arms using MoveIt, Gazebo, and RViz.

Massive thanks to Compsoft for giving me time to dive deep into ROS2 — they had a cool idea involving it, which gave me the perfect excuse to learn. I still have plenty to master, but I’m feeling far more confident than I did a few months ago.

AI
ChatGPT has been incredibly helpful throughout the ROS2 journey. Sure, it’s made its share of mistakes — looping, incorrect suggestions, etc. — but it’s been a great learning tool. The bigger vision is to use AI to simplify robotic arm control, lowering the barrier to entry for others.

Burf.co
The main website has been updated to reflect my focus on robotics. The new goal is to offer consultancy and robot rentals — an exciting next step in the Burf journey!

Inmoov humanoid 3
Yes, somehow I’ve ended up with a third InMoov robot — rescued just before it was scrapped. It still needs a lot of work, but I’ve managed to piece it back together and source enough parts to upgrade the others. This one now has LCD eyes, which adds a fun new dimension.

The Plan
At the start of this year, I set myself three high-level goals:

  • MyRobotLab + The Inmoov + LLM
    (Rock paper Scissors demo)
  • ROS2 and the Rover Robot
    (ROS2-powered rover and MoveIt-controlled robotic arms)
  • Gwiz and predictive driving
    (Train an AI to drive the G-Wiz using collected data)

So far, I’ve successfully achieved the first two. The third — predictive driving with the G-Wiz — hasn’t really taken off yet, though it’s still on the roadmap.

In the meantime, I’ve also acquired a whole fleet of Sanbot Elf robots and rebranded Burf.co as a robotics-focused website.

Thanks to changes at work, I’m now spending a lot more time on robotics. That’s not only helped me hit my previous goals faster, but it’s also inspired and refined the next set of objectives I’m working towards.

New Goals
These goals aren’t in priority order and will naturally evolve over time, but they represent the next phase in my robotics journey:

  • Create a Sanbot Remote Control App
    Design a controller that streams all sensor data from the Sanbot to a user interface, allowing remote movement and activation of functions.
    — Critical for the robotics rental business idea.
  • Continue to deepen my ROS2 knowledge — this remains vital for both work and personal robotics development.
    — Ongoing learning improves real-world application and problem-solving.
    • Fix bugs around MyCobot280/xArm hardware, so that API works well
    • Finish self-driving course on Udemy
    • Understand SLAM and mapping better so that a robot can travel between rooms
    • Add IMU support
  • Continue Developing the InMoov Humanoids
    These are great for showcasing my skills and drawing crowds at events.
    • Have an active one for improving Vision and LLM use. Think about using a Neural Net to give personality. Think about gestures.
    • Create a mobile Inmoov robot, even if it's remote
    • Consider merging this with the Sanbot remote control app
  • Have Fun with the G-Wiz
    This one’s for fun, learning, and experimentation.
    • Design a remote-control system (concept-only for now)
    • Add sensors and record journey data
    • Use that data in a self-driving AI test
    • Explore ROS integration for car control

These are some pretty epic goals, and definitely enough to keep me both challenged and inspired. Each one strengthens my experience in robotics across multiple fronts — hardware, AI, mobility, UX, and control systems.

I’ll need to keep an eye on context switching to ensure I stay focused enough on one thing at a time to make meaningful progress.

June 2025: Rock, Paper Scissors Edition

Hello and welcome to another Burf Update. The focus on robotics has continued throughout the last month, which is always a good sign!

MTC: Robotics and Automation Event
So we are exhibiting at the Robotics and Automation event at the MTC again, a fantastic event based in Coventry. This year we are focusing on bespoke AI and using a couple of my robots to demo it. We are taking a Sanbot Elf robot to wander around and interact with people (face detection, etc.). This is running a similar app to the one we wrote for the Film and Movie event last month, but with a focus on AI.

I have also been busy getting the 2nd InMoov humanoid robot ready for the show. This is running a Rock, Paper, Scissors AI demo, which is all offline. The robot detects faces, asks to play a game with you, then detects your hand, works out what you have played via an LLM, and returns a witty message via another LLM. I should find out tomorrow how well it works; however, in testing, it has been pretty good.

Gwiz and predictive driving
The Gwiz passed its MOT; yup, I am totally shocked. So now I am working on getting it properly kitted out. I have one dead battery that I am trying to replace, and once that has been done, I will be wiring it up with all sorts of sensors. I am pretty excited about this!

ROS2 and the Rover Robot
Very little progress this month on ROS2 learning, due to all of the other robotics stuff going on.

AI
As mentioned last month, this is continuing to have a bigger and bigger impact in the things I do on a daily basis. After the show, I hope to sit down and brainstorm the next few months goals and how to achieve them.

Conclusion
Due to work, I am getting more and more time to do robotics, which is great; however, I still need to define what my goal is. Am I mainly doing this for fun, or do I want to build something that turns into a product? When I first created the Burf.co search engine, the Internet was a simpler place. Now everything is moving at such a rate that by the time I have an idea, it already exists. I like the idea of renting robots out, or building custom robotics for events, but is there a need for that?

May 2025: The Sanbot Edition


Hello and welcome to another Burf Update! This update is all about the Sanbot Elf Robot, of which I managed to acquire 17 🙂


Not the most normal thing to do—especially on the same day your partner moves in—but I’m not exactly known for doing normal things. A few of them needed work, but I’ve got about 10 of them up and running and have written my first demo app for work.


I think this is a great intro to commercial robotics. My current thoughts are to rent out the robots for events, etc., with custom software designed to engage the public and tell them about products and services—a RaaS (Robot as a Service) model.

Drummer Robot
While picking up the Sanbot robots, I also picked up this epic custom drummer robot. It needs work, and I’ll probably break it for parts, but it’s an interesting thing to study.

Inmoov2
I’ve made a little progress here, though my time has mostly been taken up by the Sanbots. I’ve started building the power management area and have added a screen to the front of the robot. I hope to have it fully running on batteries by the end of May.

ROS2 and the Rover Robot
Very little progress this month on ROS2 learning, due to all the Sanbot development.

Gwiz and predictive driving
After the mess with Autoglass—who, to be fair, did compensate me very well—I managed to fit my own windscreen. Pretty messy business, but at least it’s fixed. Now I need to prepare it for the MOT. If it fails, I guess that’s the end of that.

AI
It still amazes me how powerful Large Language Models (LLMs) are. They help me daily, and I can only see myself using them more. The Sanbot app above connects to OpenAI and uses Azure Search AI to tell clients about our products.

Engagement
It’s definitely exciting times—I proposed to my partner Ruth a few weeks back at the top of The Shard, and she said yes!

Conclusion
I think I’m making good progress with the Sanbots, and I’m hopeful to get the InMoov2 Robot moving very soon. The Sanbots give me valuable insight into how a commercial company approaches robotics—from engineering and electronics to the SDK. This has been both insightful and helpful for my journey moving forward.

The only downside is that it has pulled me away from working on ROS2 for a bit, but I’m hoping to get back to that shortly. I’m still focused on my goal in robotics—the target just keeps evolving.

April 2025: Where did March go?

Hello and welcome to another Burf update. In this update, I try to think about what happened in March! I am sure I did something.

Burf.co Blog
I have updated some of the robotics pages attached to the blog. Nothing massive, but it helps to focus on robotics, etc.

MyRobotLab + The Inmoov + LLM = InMoov Remote
So, this little side project turned out to be useful for making a demo video for a client. It was just prototyping, but still useful. I used hand signals to turn the robot's microphone on and off, then forwarded all verbal commands to a custom LLM, allowing full control of the response. The large language model had been trained on their brochure for context. I did add my own LLM bridge, as mentioned last month.

ROS2 and the Rover Robot
I have continued to learn ROS2; there is still so much to learn. I am nearly halfway through another course.

Inmoov2 on wheels

So, still (hugely) a work in progress, but I have built a platform attached to an old electric wheelchair and mounted my second InMoov robot on it. It's a bit of a beast, but you have to start somewhere. The idea is to use a Raspberry Pi 5 to control the robot; any big compute load will be sent to another computer to process (e.g., voice and the LLM). I want a truly mobile robot to take out and show people.

Inmoov1
So my main Inmoov robot got fired up last week. I had fitted a new arm to it and hadn’t had time to test it. This was for another demo for someone who found me on LinkedIn. I do find it an exciting time.

Gwiz and predictive driving
Autoglass has properly messed me around, and after six weeks, the Gwiz is still not fixed. The batteries are now degrading, and it's not looking good, sadly. I am not sure right now what to do.

AI
Still doing exciting things with AI; if only I could remember all the cool things it teaches me. I did a neat demo using ChatGPT: taking an image and working out where a robotic arm should move to. That was pretty cool.

Brilliant
I am still using this app heavily, and I am now taking notes on things I learn, as it's gotten a lot harder! I think my streak is 127 days!

Birthday
So I am now a year older, and as always, I got some new kit: a ROS2 book to read, an IMU sensor to play with, and a super cool ESP32-S3 development board (it comes with all sorts).

Conclusion
Again, not sure there is one, but I feel I am focusing on robotics, which is the aim. My weight, sadly, has gone through the roof and needs to be reined in!

February 2025: Igniting the fire!

Hello and welcome to another Burf Update; in this update we have an exciting title! So what have I been doing?

Burf.co Blog
So I have spent some time updating the content of the blog in the hope that it will focus me on the future, which for me is robotics. I hope to re-ignite my Facebook page and spread my news a little further (when there is something useful to say). I have added an AI page and broken down my robotics projects by technology (LEGO, VEX, InMoov, and ROS).

It does help me realise that I have still achieved some cool things, not as much as I hoped but progress is continuing. I still have some content to add, so this is a WIP.

MyRobotLab + The Inmoov + LLM = InMoov Remote
So one of my goals was to understand MyRobotLab more and to be able to write my own external Python scripts to control the InMoov humanoid robot and the LLM (Ollama) that it uses. It's still a work in progress, but I have made a working demo, which can be found here:

Features so far:

  • Consume webcam feed and apply YOLO11 and/or Hand/Face detection
  • Stop and start the robot listening
  • Can get the robot to speak, or respond to a text input (e.g. answer a question)
  • Lists all responses back from MRL
  • Can ask the LLM for a response
  • Maps all of the services to JSON so that you can get things like a list of gestures.

Next, I want to put my own LLM layer in so that I can bypass the MyRobotLab chatbot. This will allow me to add a database connection so I can record responses for review, plus add a memory of sorts.
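A rough sketch of what that bridge could look like, with SQLite for logging and a replay of recent exchanges as a crude memory. The class name and its ask callable are hypothetical stand-ins for the real Ollama-backed layer:

```python
import sqlite3

class LLMBridge:
    """Log every exchange to SQLite and replay recent turns as memory."""

    def __init__(self, ask, db_path=":memory:"):
        self.ask = ask  # callable: prompt string -> response string
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS chat (prompt TEXT, response TEXT)")

    def respond(self, prompt, memory_turns=3):
        # Replay the last few exchanges as lightweight context.
        rows = self.db.execute(
            "SELECT prompt, response FROM chat ORDER BY rowid DESC LIMIT ?",
            (memory_turns,)).fetchall()
        context = "".join(f"User: {p}\nBot: {r}\n" for p, r in reversed(rows))
        response = self.ask(context + f"User: {prompt}\nBot:")
        # Record the exchange so it can be reviewed later.
        self.db.execute("INSERT INTO chat VALUES (?, ?)", (prompt, response))
        self.db.commit()
        return response
```

Because everything lands in one table, reviewing responses is just a SELECT, and the memory window can grow or shrink per call.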

ROS2 and the Rover Robot
Good news on that front too: I have continued my ROS2 course, and I have fixed the major issues with the rover robot when trying to get it to use Nav2 and SLAM. It was all down to the costmap's inflation_radius: the robot inflated the area around detected obstacles so much that it could not plot a route around them.
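For anyone hitting the same wall: costmap_2d inflates a cost bubble around every obstacle that decays exponentially with distance, and if inflation_radius is too large, the bubbles from opposite walls of a gap overlap and no low-cost path remains. A rough sketch of the cost curve; the constants follow the usual costmap_2d convention, but treat this as an illustration rather than the exact Nav2 code:

```python
import math

LETHAL = 254     # cell is an obstacle
INSCRIBED = 253  # robot footprint would definitely collide

def inflated_cost(distance, inflation_radius, inscribed_radius=0.15,
                  cost_scaling_factor=3.0):
    """Cost of a cell at `distance` metres from the nearest obstacle."""
    if distance <= 0.0:
        return LETHAL
    if distance <= inscribed_radius:
        return INSCRIBED
    if distance > inflation_radius:
        return 0
    # Exponential decay away from the obstacle, as in costmap_2d.
    return int((INSCRIBED - 1) * math.exp(
        -cost_scaling_factor * (distance - inscribed_radius)))
```

With inflation_radius at 0.55 m, a cell 0.4 m from a wall still carries cost; shrink the radius to 0.3 m and that same cell drops to zero, which is the kind of tuning that got my rover planning routes again.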

The aim is to complete the ROS2 course I am doing on Udemy (another course), refine the rover robot so it works a lot better using SLAM and Nav2, and then build a much better version that's at least half human size.

Gwiz and predictive driving
Progress with the Gwiz has sadly stalled at the moment. First it was the cold weather killing the batteries, and now someone has smashed the windscreen. Autoglass has tried to replace it; however, it has turned into a bit of a custom job, as there are not many Gwizs left on the road.

AI
All of the above relies heavily on AI in some form or another. I now use ChatGPT and other LLMs on a daily basis to help me achieve tasks; at work, I am now training models across datasets to detect trends, anomalies, and insights. It is quite extraordinary how fast it has entered a large part of my life. There is no goal attached to this heading, just a reflection.

Conclusion
Not sure if conclusion is the right word, but focusing on the three areas above is at least a good start towards becoming more focused. Each one could take me into an exciting area; it's just a matter of coming up with the use case and sharing the knowledge.

January 2025: Happy New Year Edition

Hello and welcome to another Burf Update. I hope you all had a lovely time over the festive season.

New Year, new goals.
So I am trying to break the mould of always doing the same things every year, even though it has worked pretty well: I set some high-level goals, slowly refine them through the year, and usually hit them. However, my criticism of myself is that there are generally too many (just look at how many random projects I had last year) and I hit each of them in a shallow way instead of in depth.

Last year, I deepened my knowledge of MyRobotLab, focused more on working with the Inmoov Robot than ever before, and began learning ROS2.

At the moment I have 3 high-level projects that are very similar to last year's; however, I want to take them to a greater depth.

They are:

  • MyRobotLab + The Inmoov + LLM
  • ROS2 and the Rover Robot
  • Gwiz and predictive driving

MyRobotLab
So last year I did set myself some targets around learning this, and within a month I had smashed them; however, I still don't really know how it works that well. My aim for this year is to properly learn how the services work, how to subscribe and publish, and how to integrate an LLM (Llama) so the 2nd InMoov robot can have conversations with people. Maybe the output of this would be to document it all.

ROS2
So last year I followed a ROS2 tutorial and built a rover robot that uses SLAM and Nav2. However, it doesn't really work too well, and I do not really understand a lot of it. ROS2 is huge and quite complex, so properly investing time in this could be a good thing. I would then like to extend it to control a robotic arm (an Elephant Cobot arm).

Gwiz
Now that I own my 3rd Gwiz, I think I need to accept that I need one in my life! I would like to do something useful with it; maybe I'll look back at the self-driving course I did and see if I can get it to predict driving movements. I think the sensor fusion would be interesting, and this would later feed into ROS2 again.

Current state of play
I am still pondering what to do with the above projects. I have already got the 2nd InMoov robot up and running (since starting to write this post), working with LLaVA and Llama 3.2. The kids find it very exciting now it can talk a lot. I plan to continue with this for a week or so more so that it's refined. I have worked closely with the community on getting this working, as some bugs were found.

Burf.co
I really do need to work out what to do with this. Maybe it can feed into the InMoov robot.

Inmoov Robot
The main InMoov robot now has a new hand and arm; it needs testing but should be completely operational. I need to review the torso to see if that still works.

Brilliant Math App
Still loving this app; I have been using it for 45+ days in a row.

The goal for the next blog post is to make some decisions and refine the goals.