Building a ChatGPT-Powered Robot Assistant (BIA) (part3: Otto the Robot)

by TylerDDDD

Hello

Hi there :) My nickname is TylerMaker, I live in France and I am happy to share the lines below with you. This Instructable is the third in a series of three.

What is BIA (AI)?

BIA is an AI voice assistant, all in one piece of software: ChatGPT, voice commands, YouTube, Netflix, TV, radio and web navigation. BIA is multilingual... and BIA also brings AI to the educational Otto robot :)

Examples of what you can do with it:

  • Chat with ChatGPT
  • Ask Bia to open a web page, YouTube, a radio, a TV channel or Netflix
  • Ask Bia to take a picture
  • Ask for complex instructions with If This Then That logic
  • Save complex instructions as a macro
  • Ask Otto the robot to dance, sing, walk, ...
  • And more

The brain of BIA is the Python module biaspeech. I developed the package and made it available for download. This Instructable explains how to install the module, customize it and use it together with Otto the robot.
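The module is published on PyPI (see the Annexes), so on most systems it can typically be installed with pip (depending on your setup the command may be pip instead of pip3):

pip3 install biaspeech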

Supplies


You need... a robot :)

We purchased and built a simple base version of Otto the robot.

You can order one online on Amazon. Please note that the links below are part of the Amazon associates program.

Many other educational robots exist on the market. Only Otto has been tested, but the others should work; the main work would be adapting the config file of the biaspeech Python module. Examples of other robots:

  1. Elegoo Penguin Bot
  2. Elegoo Smart Robot Car
  3. another robot car
  4. a robot arm

All the steps to build Otto are explained on the ottodiy website: link

The final version of Otto is as shown in the screenshot.

Play around with the robot alone

You will have to install the Arduino IDE, test it upfront, and familiarize yourself with some basic scripts found online that can be uploaded to the robot: the steps are described here.

It is important that you run one or two test scripts on the robot. The Arduino libraries OttoDIYlib-master and Servo also have to be installed with the Arduino IDE. Everything is explained in the code section of the ottodiy website, here. When you are able to make Otto walk or dance, you can go to the next step :)

Note:

  1. The Otto scripts provided online are static: one script makes the robot dance, another makes it walk
  2. The approach with BIA is that you speak, then Arduino code is generated on the fly and pushed automatically to the robot (see the sketch below)
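To make the idea concrete, here is a minimal sketch of such an on-the-fly flow. This is only an illustration, not the actual biaspeech implementation: it assumes the openai Python package and the arduino-cli command-line tool are installed, and that the board and port values match the ones you will set in config.cfg in the Plug BIA and Otto step.

# Sketch only: ask ChatGPT for an Otto sketch, then compile and upload it with
# arduino-cli. The real biaspeech module may use a different toolchain.
import os
import subprocess
from openai import OpenAI  # assumes the official openai package is installed

BOARD = "arduino:avr:pro"        # same value as in config.cfg
PORT = "/dev/ttyUSB0"            # same value as in config.cfg
SKETCH_DIR = "generated_sketch"  # folder holding generated_sketch.ino

def generate_and_upload(instruction: str) -> None:
    client = OpenAI()  # needs OPENAI_API_KEY in the environment
    prompt = ("Write a complete Arduino sketch for an Otto DIY robot "
              f"that makes it {instruction}. Return only code.")
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    code = reply.choices[0].message.content

    # Write the generated code to a sketch folder, then compile and upload it
    os.makedirs(SKETCH_DIR, exist_ok=True)
    with open(os.path.join(SKETCH_DIR, "generated_sketch.ino"), "w") as f:
        f.write(code)
    subprocess.run(["arduino-cli", "compile", "--fqbn", BOARD, SKETCH_DIR], check=True)
    subprocess.run(["arduino-cli", "upload", "-p", PORT, "--fqbn", BOARD, SKETCH_DIR], check=True)

generate_and_upload("dance the salsa")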

The Concept


Concept

A double-layer AI mechanism provides fast and accurate answers:

First layer

The first layer is powered by OpenAI (ChatGPT); it takes care of answering the questions.

Second layer

The second layer is handled by a native AI; it takes care of:

  • Cache management: questions and answers are cached in a local database to get rid of network latency whenever possible (requests to ChatGPT are network dependent, requests to the cache are not), which ensures faster answers
  • Categorization: the input is categorized with an NLP (Natural Language Processing) technique into categories like question, feedback, action, etc.
  • Paraphrases: if, for example, you ask "how are you" and later "how are you doing", both questions are considered similar and the same answer is picked up from the cache (see the sketch after this list)
  • Emotions: the user input is scored on several axes (positive, negative, neutral). Based on the emotion, the previous answer gets promoted or demoted in the cache
  • Scoring: each prompt/interaction is scored, so that only the best answer is given
  • Smart actions (complex requests), like If This Then That requests, instructions to the Otto robot, etc.
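As a rough illustration of the cache and paraphrase ideas, a lookup could behave like the minimal sketch below. It only shows the principle; the real biaspeech module uses its own NLP techniques and a local database.

# Sketch of a question/answer cache with fuzzy (paraphrase) matching.
from difflib import SequenceMatcher

cache = {}  # question -> answer

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def lookup(question: str, threshold: float = 0.7):
    # Return the cached answer of the most similar question, if similar enough
    best = max(cache, key=lambda q: similarity(question, q), default=None)
    if best is not None and similarity(question, best) >= threshold:
        return cache[best]
    return None  # cache miss: the question would go to ChatGPT instead

def store(question: str, answer: str) -> None:
    cache[question] = answer

store("how are you", "I am fine, thank you!")
print(lookup("how are you doing"))  # paraphrase: picked up from the cache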

Plug Bia and Otto


Prerequisite

As a prerequisite, please refer to the Supplies section above.

Plug BIA and Otto

  • Connect Otto via USB to a free USB port of your laptop or Raspberry Pi
  • Open the config file: <<python package>>/utils/config.cfg
  • Check the board parameter under the [arduino] section: it should match the one in your Arduino IDE
  • Check the port parameter under the [arduino] section: it should match the one in your Arduino IDE

Depending on your system, the config file can for example be:

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/biaspeech/utils/config.cfg 

Arduino section:

[arduino]
...
board = arduino:avr:pro
port = /dev/cu.usbserial-1460

The board parameter should normally not be changed. The port parameter has to be changed. To find the correct value, connect the Arduino/robot via USB to the Raspberry Pi or laptop and simply run:

ls /dev/tty*

Now, disconnect the board and run the command again. This way you will identify the serial device of your Arduino. Please refer to this page for more details :) In our case the port identified was /dev/ttyUSB0 on the Raspberry Pi and /dev/cu.usbserial-1460 on macOS. Identify your port parameter and change the config.cfg file with this new value:

[arduino]
...
board = arduino:avr:pro
port = /dev/ttyUSB0
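If you prefer to list the serial devices from Python instead of comparing two ls outputs, the pyserial package can do it (this is an optional helper, not part of biaspeech):

# List the serial ports seen by the system; run it with the robot plugged in
# and the Arduino board should show up with its device name and description.
from serial.tools import list_ports  # provided by the pyserial package

for port in list_ports.comports():
    print(port.device, "-", port.description)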


Plug Bia and Otto... and Play


Play with it :)

Press the "Push to talk" button on the touch screen and give instructions to Otto. Try these instructions:

  • Robot happy
  • Robot sing
  • Robot sad
  • Robot stop
  • Robot dance the salsa
  • ...

Note: this also works with the keyword Arduino instead of Robot.

The predefined keywords for the robot are: walk, back, left, right, stop, happy, sad, surprise, moonwalkerleft, moonwalkerright, sing.
A predefined keyword means that the Arduino code for that keyword/action is preloaded on the Arduino. For anything other than those 11 predefined actions, a live request to ChatGPT dynamically builds the Arduino code.
=> Example: "robot walk" .. the keyword walk is predefined, so the Arduino code for it is preloaded
=> Example: "robot dance the salsa" .. this is a new action, so ChatGPT generates the Arduino code live


Then you can create complex instructions with the keyword Python (experimental feature ;)). For example:

  • Python if the weather is fine then robot be happy

Finally, you can create macros to simplify the chat with Otto:

  • Robot dance the salsa
  • Macro salsa

This saves the request "Robot dance the salsa" under the keyword "salsa". Now just ask:

  • Salsa

Control the Robot Remotely


Play remotely...

Instead of using the touch screen, you can connect via VNC Viewer. It is a more fun way to control BIA and Otto remotely. You only need to install a VNC app. I am using an iPhone X, and I installed the RVNC Viewer app.

The app is quite simple to use: you enter the IP address of the Raspberry Pi and then see the BIA graphical interface on the screen of your smartphone. Push the button, ask the robot to dance or ask any other question... here we go :)
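If VNC is not enabled yet on the Raspberry Pi, it can usually be switched on under Interface Options in the Raspberry Pi configuration tool (exact menu names may vary with the OS version):

sudo raspi-config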

What Is Next?

Next developments of BIA (AI)...

Many nice enhancements remain to be implemented, like:

  • Enhance the Camera smart action. For the moment it only takes a picture; as an enhancement it could analyse the content of the picture, and the output could be used by the If This Then That Python smart action. We could then, for example, ask the robot to describe what it sees, interact with the environment, make decisions, etc.
  • Improve the Python action (if this then that)
  • Develop an Amazon Alexa skill that could send the instructions from Alexa to BIA and to the robot
  • Integrate BIA with other robots than Otto, like the Elegoo Penguin Bot, the Elegoo Smart Robot Car, another robot car or a robot arm
  • Add other sensors to the robot, like temperature, light, humidity or pressure, and have them managed by BIA
  • ...

Annexes

Small donations matter: you can buy me a coffee :)

BIA(AI) © 2024 by Nicolas CHRISTOPHE is licensed under CC BY-NC-ND 4.0. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/4.0/

The python package is available on PyPi : https://pypi.org/project/biaspeech/