Feb 2024: Fog Edition

Hello and welcome to another Burf Update. Warning: this one could be total nonsense!

So, in the last update, I defined a plan, which is below. The aim of this plan was to stay focused and achieve a level of depth in a subject (e.g. MyRobotLab). A deep dive, as they say! Now, the issue is that the software side of the InMoov robot is really good and comes with a lot of epic features out of the box, so my planned deep dive into it isn't really required.

This year’s Targets

  • Finish printing the 3D parts, make improvements to the body
    I am currently rebuilding the chest with sensors etc
  • Rebuild / Fix the head as the eyes do not function very well
    I have printed out the parts for this, just not put them together 
  • Learn MyRobotLab (MRL)
    I have spent a few hours going over the software
  • MRL: Be able to say some words, and have it do a custom gesture
    See below
  • MRL: Be able to track or detect people using OpenCV
    MRL has many OpenCV libraries already implemented including face detection, face tracking and object detection (YOLO). I do need to spend some time playing with this.
  • MRL: Be able to call a custom API
    See below
  • MRL: Learn more about AIML (there is a Udemy course)
    Started watching the AIML course
  • Fire up the KinectV2 sensor that’s in the robot
    Not done yet; however, it turns out InMoov does not support this
  • Improve the wiring in the Inmoov robot as it is a mess
    Making progress already

Custom commands to do a custom action / call a custom API

In theory, I believe I know how to do this. There is an AIML file called oob.aiml in the ProgramAB/en-us/aiml/ directory. OOB stands for Out of Band, and I believe this is the place where you would put your custom commands. When you define a custom command, you can also define a Python command for it to call.
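
As a sketch of the idea, an oob.aiml category might look something like this. The pattern, the batteryLevel() call, and the exact OOB tags are my own illustration of the pattern rather than a copy of the real file, so treat the details as assumptions:

```xml
<!-- Hypothetical oob.aiml entry: "battery level" triggers a Python call -->
<category>
  <pattern>BATTERY LEVEL</pattern>
  <template>
    Checking my battery.
    <oob>
      <mrl>
        <service>python</service>
        <method>exec</method>
        <param>batteryLevel()</param>
      </mrl>
    </oob>
  </template>
</category>
```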

So, if I say “Battery Level” to the robot, it will call this Python function
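
For illustration, a minimal sketch of what that batteryLevel function could look like. Everything here is assumed: on the real robot the reading would come from a sensor via MRL, and the final speak call would be something like i01.mouth.speak(message) rather than print:

```python
def read_battery_percent():
    # Placeholder reading; on the robot this would come from a sensor via MRL
    return 87

def battery_message(percent):
    # Build the sentence for the robot to speak
    return "My battery is at %d percent" % percent

def batteryLevel():
    message = battery_message(read_battery_percent())
    # In MRL this would be i01.mouth.speak(message); print keeps the sketch runnable
    print(message)

batteryLevel()
```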

Once we are in Python, we can do more or less whatever we want: call APIs, move servos, etc. There are quite a few examples included in the software.

Now, there are a lot of clever things you can do in AIML: store values (your name, for example), match on wildcard commands, give out random responses from a list, etc. Human.predicates.txt seems to be where information about the session (name, age, etc.) is stored.
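
A hedged sketch of those features using standard AIML tags (the wording and the predicate name are mine): the wildcard captures whatever follows, the set stores it as a predicate, and random picks one reply from a list.

```xml
<!-- Wildcard match, storing a predicate, and a random response list -->
<category>
  <pattern>MY NAME IS *</pattern>
  <template>
    <think><set name="name"><star/></set></think>
    <random>
      <li>Nice to meet you, <get name="name"/>.</li>
      <li>Hello <get name="name"/>, good to see you.</li>
    </random>
  </template>
</category>
```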

In Python, you can control each servo from code.
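
As a rough sketch of what that looks like in an MRL Jython script (this only runs inside MRL, where Runtime is available; the serial port, the pin number, and the exact attach signature are assumptions and can vary between MRL versions):

```python
# Sketch: start an Arduino service and drive a single servo through it
arduino = Runtime.start("arduino", "Arduino")
arduino.connect("/dev/ttyUSB0")   # serial port is an assumption (e.g. COM3 on Windows)

servo = Runtime.start("servo", "Servo")
servo.attach(arduino, 9)          # pin 9 is just an example
servo.moveTo(90)                  # position in degrees
servo.moveTo(10)
```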

Details on the above can be found here

However, you can also build custom gestures using the built-in ServoMixer.

So, using the above, I could easily write a voice command that picks up the word “Burf”, which then calls a Python function waveAtBurf that controls the arm servos and says “Hello”.
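
As a pseudocode-level sketch of that idea (MRL-flavoured Python that would only run inside an MRL session; the joint values, and the moveArm argument order, are my own guesses):

```python
# Hypothetical gesture: wave the left arm and say hello when "Burf" is heard
def waveAtBurf():
    i01.mouth.speak("Hello")
    for x in range(3):
        # moveArm(side, bicep, rotate, shoulder, omoplate) - values are guesses
        i01.moveArm("left", 80, 90, 30, 10)
        sleep(1)
        i01.moveArm("left", 40, 90, 30, 10)
        sleep(1)
    i01.rest()
```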

It has an API

MRL also has an easy-to-use API system where you can find out the methods of a service using the following GET command

For example, making the robot talk is as simple as http://localhost:8888/api/service/i01.chatBot/getResponse/"what time is it"

Then you can use any of the methods using POST / GET.  More info here http://myrobotlab.org/content/myrobotlab-api

So, what this potentially allows me to do is remote control the robot from any other system. Technically, you could write a framework/layer/app on top of this with your own functionality, while still being able to control the robot.
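
Based on the URL pattern above, here is a small sketch of building those API calls from an external Python script. The helper name is mine, and the service/method names are just the ones from the example; an HTTP GET (with the requests library, or urllib) would then fetch the URL.

```python
from urllib.parse import quote  # percent-encodes spaces in arguments

BASE = "http://localhost:8888/api/service"

def mrl_url(service, method, *args):
    # Build /api/service/<service>/<method>/<arg>/... style URLs
    parts = [BASE, service, method] + [quote(str(a)) for a in args]
    return "/".join(parts)

# For example:
print(mrl_url("i01.chatBot", "getResponse", "what time is it"))
```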

The Fog

So I am now scratching my head thinking, what should I do next?  The year plan involved:

  • Finishing the Inmoov robot/wiring (in progress)
  • Learning MRL (feeling pretty good about that already)
  • Fire up the KinectV2 sensor that’s in the robot (not supported, it turns out)
  • ADDED: Have confidence that the robot is reliable (e.g. can run cycle gestures without breaking)

I did mention learning ROS 2 and making my own robot; however, these were planned for later in the year.

Current thoughts

So, by adding the extra target above and by bringing the ROS 2/DIY robot work forward, I think I can get the deep dive I am looking for.

Jan 2024: The Plan

Welcome to another Burf update; two in January, it must be a special time 🙂 In the last post, I reviewed the last year and realised I didn’t really achieve any big things. I did not make a project I was super happy with. I then listed the current projects/items floating around in my head and defined some rough project paths I could take to help make this year a year of FOCUS.

Loose Ends

So to get me to the point where I can focus, I wanted to tidy up some loose ends, so that these things can be parked until I have achieved the main focus.

Electric motorbike project

I had a play, did 3 mph, and need to change the gearing, which is a great place to park it.

Self-driving mobility scooter / Robotic project using a wheelchair

Something I should have done long ago: instead of trying to hack the controllers, replace them. At £24 for the wheelchair controller (dual channel) and £9 for the mobility scooter one (single channel), the problems just go away, and my time at the moment is worth far more than £30. So both of these can now be controlled via an RC controller I found in the loft. The wheelchair will hopefully be used for a robotics project this year.

The Plan

So the focus of this year is to finish the InMoov robot and learn MyRobotLab. I am still baffled as to why I have not focused on this amazing creation by Gaël Langevin. I have already made a great start on this; however, I want to detail what I want to achieve so I do not get sidetracked.

The current state of play

I moved the robot downstairs so it had more room, wired it up, fired up the software, and expected something to snap. However, bloody hell was I surprised when it just worked the first time!

I had a play with it, and after a few gestures decided that was enough greatness for the day, and that stopping now meant the day would end on a high! What I learned from this was that I didn’t know how to use MyRobotLab as well as I had hoped, and that it is a super powerful system.

This year’s Targets

  • Finish printing the 3D parts, make improvements to the body
  • Rebuild / Fix the head as the eyes do not function very well
  • Learn MyRobotLab (MRL)
  • MRL: Be able to say some words, and have it do a custom gesture
  • MRL: Be able to track or detect people using OpenCV
  • MRL: Be able to call a custom API
  • MRL: Learn more about AIML (there is a Udemy course)
  • Fire up the KinectV2 sensor that’s in the robot
  • Improve the wiring in the Inmoov robot as it is a mess

I think this is a great starting point and is very achievable within a few months. That is fine, because once it is complete I can move on knowing I have done a deep dive into the InMoov robot. I could list what I want to do after this; however, that muddies the water, and that’s what I am trying to avoid.

Minor Jobs

Like any normal person, I will still have minor jobs to do (I have already fixed the Harley), like fixing the Gwiz, fixing the conservatory, and installing reversing sensors on the Nissan. These will be fun to do, but won’t eat into my robotics time. I think that’s a good way to split things up: robotics mainly on Wednesday afternoons (which I no longer work), other jobs at any other time.

The January Update!

Welcome to another update; as you may notice, this seems to be a monthly update. I am not sure if that is going to become a thing, or if I will go back to weekly updates. I guess I wanted to see how far I got in a month of having a day off work.

So how far did I get? Have I defined any goals or targets? Have I taken on 15 new projects, or actually tried to focus? I believe I lost 1 or 2 Wednesdays (my day off) during January; however, they were required, so I am not going to beat myself up about that.

InMoov Humanoid Robot

The InMoov Humanoid Robot was one of the main things I wanted to do last year, and I didn’t really make as much progress as I wanted due to other projects and distractions. Well, the good news is I have now made great progress with this. I have learned the basics of MyRobotLab, which is the software that controls it, and I have mostly wired the robot up.

So, after some initial testing, I have a list of things to fix. I guess this is expected, but it’s going to take a while:

  • Front neck piston jams
  • Lower torso movement is not working properly (this worries me)
  • Eye Y movement servo is now dead
  • Eye X movement is a bit poo
  • Need to finish the head roll movement (probably just gluing)

I have got to say the Inmoov/MyRobotLab Discord channels are epic and the software really is very good.


Maths

Trying to kickstart this again. I am getting better, but I feel I need to sign up for a course. I did some initial simple maths for a work Diploma that I was going to do, but it appears I am too qualified for the course 🙂 I am going to focus on Algebra and Geometry. I think the plan of action for the moment, until I find a course, is to spend an afternoon on Khan Academy.


Lots of work has been done in the garage, and all of the LEGO that was in the LEGO room (where I built stuff) has now been moved out. The idea is to clear the building room so that I can work on bigger projects, for example the Nissan Micra or the Tigger TGB.

Autonomous Car: Deep Learning & Computer Vision for Beginners

I have signed up for a Udemy course (yes, I know) that will help me make the mobility scooter self-driving. Instead of being all simulated like the previous course, this one uses real hardware, which I think will be super useful.

Mobility Scooter / Electric Wheelchair

So I have moved both the electric wheelchair and the mobility scooter into the conservatory. I really want to make some progress on these. They have both been jacked up off the ground so that it is easier to test stuff. I have also managed to replace the analog potentiometer (which controls movement) on the mobility scooter with a digital one. I now plan to make both of these remote controlled (still using an Arduino).

Making things easy

So another area of stuff I have been giving some thought to is removing the barriers/problems/guff to certain things to make life easier.

So, for example, I made an InMoov robot head last year, which was a fantastic project. However, I haven’t continued work on it because the audio feedback from the speakers did me in; it was so annoying that it put me off. So this year, I have done some research and plan to fix this issue to bring the project back to life.

Another example is Burf.co updates: deploying website/API changes was a bit of an art form because I am using my own hosting with .NET and Ubuntu. Again, I spent an hour or two looking for automatic deployment platforms. I found DeployHQ, which works like a charm; now, if I do a push to Git, it deploys everything ready for me to put live 🙂 This will encourage me to do more work on Burf.co because deployments are now easy!

I think these small tweaks really help and so I hope to find others that just make the process of building, creating and designing more fun and enjoyable.

The Conclusion and defining some Goals

I am not sure if ‘the conclusion’ is the right word or not, but based on the previous blog post, I am staying pretty focused on the main areas I wanted to work on.

I think the goals for the next few months should be:

  • Finish off wiring the InMoov robot and fix the issues listed above so that the robot can be demonstrated to people.
  • Remove the grounding issue from the Red Inmoov robot
  • Add paging to Burf.co and update the database
  • Create a demo of controlling the electric wheelchair with a PS2 controller (on order from AliExpress)
  • Create a demo of controlling the mobility scooter with a PS2 controller
  • Spend some time at Khan Academy on Algebra and Geometry
  • Continue with self-driving courses