Feb 2024: Fog Edition

Hello and welcome to another Burf Update. Warning: this one could be total nonsense!

So, in the last update, I defined a plan, which is below. The aim of this plan was to stay focused and achieve a level of depth in a subject (e.g. MyRobotLab). A deep dive, as they say! Now the issue is that the software side of the Inmoov robot is really good and comes with a lot of epic features out of the box, so the deep dive I was aiming for isn't really required.

This year’s Targets

  • Finish printing the 3D parts, make improvements to the body
    I am currently rebuilding the chest with sensors etc
  • Rebuild / Fix the head as the eyes do not function very well
    I have printed out the parts for this, just not put them together 
  • Learn MyRobotLab (MRL)
    I have spent a few hours going over the software
  • MRL: Be able to say some words, and it to do a custom gesture
    See below
  • MRL: Be able to track or detect people using OpenCV
    MRL has many OpenCV features already implemented, including face detection, face tracking and object detection (YOLO). I do need to spend some time playing with this.
  • MRL: Be able to call a custom API
    See below
  • MRL: Learn more about AIML (there is a Udemy course)
    Started watching the AIML course
  • Fire up the KinectV2 sensor that’s in the robot
    Not done yet; however, it turns out Inmoov does not support the KinectV2
  • Improve the wiring in the Inmoov robot as it is a mess
    Making progress already

Custom commands to do a custom action / call a custom API

In theory, I believe I know how to do this. There is an AIML file called oob.aiml, which is in the ProgramAB/en-us/aiml/ directory. OOB stands for Out of Band, and I believe this is the place where you would put your custom commands. When you define a custom command, you can also define a Python function for it to call.

Example:

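I haven't copied this from the file, but from memory an entry in oob.aiml looks roughly like this (the exact tag layout should be checked against the oob.aiml that ships with MRL, and batteryLevel() is just a placeholder name for the Python function it calls):

   <category>
      <pattern>BATTERY LEVEL</pattern>
      <template>
         Checking the battery
         <oob>
            <mrl>
               <service>python</service>
               <method>exec</method>
               <param>batteryLevel()</param>
            </mrl>
         </oob>
      </template>
   </category>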

So, if I say “Battery Level” to the robot, it will call this Python function.

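Something along these lines (a minimal sketch: the value is hard-coded just to show the shape, and i01.mouth.speak() is the usual way the InMoov scripts make the robot talk):

   def batteryLevel():
       # Placeholder value; in a real version this would come from a sensor or an API
       level = 85
       i01.mouth.speak("Battery level is " + str(level) + " percent")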

Once we are in Python, we can kind of do whatever we want.  Call APIs, move servos, etc.  There are quite a few examples included in the software.
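For example, calling an external web API from the scripting side is just normal Python. A rough sketch, assuming the MRL Python service is Jython 2.x (so urllib2 is available) and using a made-up URL:

   import urllib2

   def checkWeather():
       # Hypothetical endpoint; replace with whatever API you actually want to call
       response = urllib2.urlopen("http://example.com/weather")
       i01.mouth.speak("The weather service says " + response.read())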

Now there are a lot of clever things you can do in AIML: store values (your name, for example), match on wildcard commands, give out random responses from a list, etc. Human.predicates.txt seems to be where information about the session (name, age, etc.) is stored.
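As a quick illustration (generic AIML, not copied from the InMoov files), a category that grabs a name with a wildcard, stores it, and gives back a random response could look like:

   <category>
      <pattern>MY NAME IS *</pattern>
      <template>
         <think><set name="name"><star/></set></think>
         <random>
            <li>Nice to meet you <get name="name"/>.</li>
            <li>Hello <get name="name"/>, I will remember that.</li>
         </random>
      </template>
   </category>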

In Python, you can control each servo from code:

   i01.moveArm("left", 43, 88, 22, 10)
   i01.moveArm("right", 20, 90, 30, 10)
   i01.moveHand("left", 0, 0, 0, 0, 0, 119)
   i01.moveHand("right", 0, 0, 0, 0, 0, 119)

Details on the above can be found here

However, you can also build custom gestures using the built-in ServoMixer.

So using the above, I could easily write a voice command that picks up the word “Burf” and calls a Python function waveAtBurf, which controls the arm servos and says “Hello”.
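Roughly something like this (the servo positions are arbitrary numbers, and I would still need the matching oob.aiml category for the “Burf” pattern):

   from time import sleep

   def waveAtBurf():
       # Raise the right arm (placeholder positions)
       i01.moveArm("right", 80, 90, 30, 20)
       i01.mouth.speak("Hello")
       sleep(2)
       # Back to a rest position
       i01.moveArm("right", 5, 90, 30, 10)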

It has an API

MRL also has an easy-to-use API system where you can find out the methods of a service using the following GET command

http://localhost:8888/api/service/{name}/

For example, making the robot talk is as simple as http://localhost:8888/api/service/i01.chatBot/getResponse/"what time is it"

Then you can use any of the methods using POST / GET.  More info here http://myrobotlab.org/content/myrobotlab-api

So, what this potentially allows me to do is remotely control the robot from any other system. Technically, you could write a framework/layer/app on top of this with whatever functionality you require, and still be able to control the robot.
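As a quick sketch of that idea, here is how another system could hit the getResponse example above from plain Python 3 (assuming the robot is reachable on localhost:8888; from elsewhere on the network only the hostname would change):

   import urllib.request, urllib.parse

   # Ask the robot's chatbot a question over the MRL REST API
   question = urllib.parse.quote('"what time is it"')
   url = "http://localhost:8888/api/service/i01.chatBot/getResponse/" + question
   print(urllib.request.urlopen(url).read().decode())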

The Fog

So I am now scratching my head thinking, what should I do next?  The year plan involved:

  • Finishing the Inmoov robot/wiring (in progress)
  • Learning MRL (feeling pretty good about that already)
  • Fire up the KinectV2 sensor that’s in the robot (not supported it turns out)
  • ADDED: Have confidence that the robot is reliable (e.g. can run cycle gestures without breaking)

I did mention learning Ros2 and making my own robot; however, these were planned for later in the year.

Current thoughts

So by adding the target above (the one marked ADDED), and by bringing forward the Ros2/DIY robot, I think I can get that deep dive that I am looking for.
