Hello and welcome to another Burf Update, and in this update we have an exciting title! So what have I been doing?
Burf.co Blog
So I have spent some time updating the content of the blog, in the hope that it will keep me focused on the future, which for me is robotics. I hope to re-ignite my Facebook page and spread my news a little further (when there is something useful to say). I have added an AI page and broken my robotics projects down by technology (LEGO, VEX, InMoov, and ROS).
It does help me realise that I have still achieved some cool things, not as much as I hoped but progress is continuing. I still have some content to add, so this is a WIP.
MyRobotLab + InMoov + LLM = InMoov Remote
So one of my goals was to understand MyRobotLab better and be able to write my own external Python scripts to control the InMoov humanoid robot and the LLM (Ollama) that it uses. It's still a work in progress, but I have made a working demo which can be found here:

Features so far:
- Consume a webcam feed and apply YOLO11 and/or hand/face detection
- Stop and start the robot's listening
- Can get the robot to speak, or respond to a text input (e.g. answer a question)
- Lists all responses back from MRL
- Can ask the LLM for a response
- Maps all of the services to JSON so that you can get things like a list of gestures.
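To give a flavour of the services-to-JSON idea, here is a minimal sketch. Note the service names, fields, and helper functions below are placeholders I have made up for illustration; the real data would come from MyRobotLab's own API, not from a hard-coded list.

```python
import json

# Hypothetical snapshot of services reported by MyRobotLab (MRL).
# In the real demo this would be fetched from MRL; these entries
# are invented purely so the sketch runs standalone.
services = [
    {"name": "i01.head", "type": "Servo"},
    {"name": "i01.chatBot", "type": "ProgramAB"},
    {"name": "i01.gestures", "type": "Gesture", "gestures": ["wave", "nod"]},
]

def services_to_json(services):
    """Index services by name and serialise, so a client can look them up."""
    return json.dumps({s["name"]: s for s in services}, indent=2)

def list_gestures(services):
    """Collect gesture names from any service that exposes them."""
    return [g for s in services for g in s.get("gestures", [])]

print(list_gestures(services))  # -> ['wave', 'nod']
```

Once the services are in plain JSON like this, any external tool (a web UI, a script, another robot) can ask "what gestures exist?" without speaking MRL's native protocol.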
Next, I want to put my own LLM layer in so that I can bypass the MyRobotLab chatbot. This will allow me to add a database connection so I can record responses for review, plus add a memory of sorts.
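The "record responses plus a memory of sorts" part could look something like the sketch below: a thin wrapper that logs every prompt/response pair to SQLite. The `fake_llm` function is a stand-in I invented so the sketch runs on its own; in practice it would be replaced by a real call to Ollama.

```python
import sqlite3

def fake_llm(prompt: str) -> str:
    # Placeholder for a real LLM call (e.g. Ollama's HTTP API);
    # stubbed out here so the sketch is self-contained.
    return f"Echo: {prompt}"

class MemoryLLM:
    """Thin LLM wrapper that logs every exchange to SQLite for later review."""

    def __init__(self, db_path=":memory:"):
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS exchanges ("
            "id INTEGER PRIMARY KEY, prompt TEXT, response TEXT)"
        )

    def ask(self, prompt: str) -> str:
        response = fake_llm(prompt)
        self.db.execute(
            "INSERT INTO exchanges (prompt, response) VALUES (?, ?)",
            (prompt, response),
        )
        self.db.commit()
        return response

    def history(self):
        # The stored exchanges double as a crude memory: they could be
        # fed back into future prompts as context.
        return self.db.execute(
            "SELECT prompt, response FROM exchanges ORDER BY id"
        ).fetchall()

llm = MemoryLLM()
llm.ask("Wave at the camera")
```

Using SQLite keeps the review side trivial (it's just a file you can query), and the `history()` table is the raw material for building longer-term memory later.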
ROS2 and the Rover Robot
Good news on that front too: I have continued my ROS2 course and fixed the major issues the Rover Robot had when trying to use Nav2 and SLAM. It all came down to the costmap's inflation_radius: the obstacles it detected were being inflated so much that the planner could not plot a route around them.
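For anyone hitting the same thing: the inflation behaviour lives in Nav2's costmap parameters. A fragment like this shows where inflation_radius sits (the numbers here are illustrative defaults, not the exact values from my rover):

```yaml
local_costmap:
  local_costmap:
    ros__parameters:
      plugins: ["obstacle_layer", "inflation_layer"]
      inflation_layer:
        plugin: "nav2_costmap_2d::InflationLayer"
        # Too large a radius swells every detected obstacle until
        # narrow gaps close up and the planner finds no route through.
        inflation_radius: 0.35   # metres; tune to your robot's footprint
        cost_scaling_factor: 3.0
```

Shrinking inflation_radius (or raising cost_scaling_factor so the cost falls off faster) leaves the planner enough free space between obstacles to find a path.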

The aim is to complete the ROS2 course I am doing on Udemy, refine the Rover Robot so it works much better with SLAM and Nav2, and then build a much better version that is at least half human-sized.
Gwiz and predictive driving
Progress with the Gwiz has sadly stalled for the moment. First it was the cold weather killing the batteries, and now someone has smashed the windscreen. Autoglass has tried to replace it, however it has turned into a bit of a custom job as there are not many Gwizs left on the road.
AI
All of the above relies heavily on AI in some form or another. I now use ChatGPT and other LLMs on a daily basis to help me get tasks done. At work, I am now training models across datasets to detect trends, anomalies, and insights. It is quite extraordinary how fast AI has entered such a large part of my life. There is no goal attached to this heading, just a reflection.
Conclusion
Not sure if conclusion is the right word, but concentrating on the three areas above is at least a good start towards becoming more focused. Each one could take me into an exciting area; it's just a matter of coming up with the use case and sharing the knowledge.