How to make your Reachy play tic-tac-toe

Perhaps the demo uses a different version of the edgetpu package? Here is what I see via pip3 show:

Name: edgetpu
Version: 2.14.1
Summary: Edge TPU Python API
Home-page: https://coral.googlesource.com/edgetpu
Author: Coral
Author-email: coral-support@google.com
License: Apache 2
Location: /home/test/miniconda3/lib/python3.7/site-packages
Requires: numpy, Pillow
Required-by:

Hi Charles,

Usually this error happens because you didn’t fetch the actual model ‘ttt-boxes.tflite’ but only its Git LFS pointer.
Did you do this?

cd ~/dev/reachy-tictactoe
git lfs pull

Thank you, Simon. You are right. The model file ttt-boxes.tflite that I cloned from GitHub was just a placeholder. I did not know about the Git LFS package. After I installed git-lfs, I got the actual 4.5 MB model file and that error message went away.
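
For anyone hitting the same issue, here is a quick way to tell a Git LFS pointer from the real model (a minimal sketch; adjust the path to wherever ttt-boxes.tflite sits in your clone):

from pathlib import Path

# Adjust this path to your local clone.
model_path = Path('ttt-boxes.tflite')

head = model_path.read_bytes()[:40]
size = model_path.stat().st_size
if head.startswith(b'version https://git-lfs'):
    print(f'Only a Git LFS pointer ({size} bytes) - run git lfs pull.')
else:
    print(f'Looks like a real model file ({size} bytes).')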

RE: edgetpu install on the Pi

How did you install the edgetpu package on Reachy’s Pi controller?

I followed the instructions here:

coral.ai/software/#edgetpu-python-api

and downloaded the generic Linux wheel package:

edgetpu-2.14.1-py3-none-any.whl

However, after I installed the wheel with pip3 (no error messages), the package is still not accessible to Python. I get a module-not-found error:

ModuleNotFoundError: No module named ‘edgetpu’

Strangely, I was able to install edgetpu on my Ubuntu 18.04 computer with the same method. However, I got the following error message when I tried to load the model ttt-boxes.tflite:

File "/home/test/miniconda3/lib/python3.7/site-packages/edgetpu/basic/basic_engine.py", line 92, in __init__
    self._engine = BasicEnginePythonWrapper.CreateFromFile(model_path)
RuntimeError: Internal: Unsupported data type in custom op handler: -1473687232
Node number 0 (edgetpu-custom-op) failed to prepare.
Failed to allocate tensors.

Did I miss something?
Please advise. Thank you.

Charles

Hi Charles,
To install the edgetpu library on the RPi, check out the section ‘Setup Coral toolkit’ on this page of our documentation.
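
If the module still cannot be found after following those steps, one common cause is that the wheel was installed for a different Python than the one you actually run (for example a conda environment vs. the system python3). A quick check, just as a sketch:

import importlib.util
import sys

# Which interpreter is running, and can it see the package?
print('interpreter:', sys.executable)
print('edgetpu spec:', importlib.util.find_spec('edgetpu'))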

Thank you, Simon. That helps. We have the demo somewhat working now. However, our arm is flexing and shaking when it moves. This may be a quality issue with our 3D printed parts. Here is our test video:


From what I saw, I think we also removed (or never implemented) the trajectory smoothing. We should play with that.
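
For reference, the kind of smoothing meant here can be sketched with a plain minimum-jerk profile in numpy (only an illustration, not the code from the demo; the start/goal angles and duration below are placeholders):

import numpy as np

def minimum_jerk(start, goal, duration=2.0, freq=100):
    """Return a smooth joint trajectory from start to goal (degrees)."""
    t = np.linspace(0, duration, int(duration * freq))
    s = t / duration
    # Classic minimum-jerk polynomial: zero velocity and acceleration at both ends.
    profile = 10 * s**3 - 15 * s**4 + 6 * s**5
    return start + (goal - start) * profile

traj = minimum_jerk(start=0.0, goal=45.0)
print(traj[:5], traj[-1])  # ramps up gently and ends exactly at the goal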

We started running the tic-tac-toe game_launcher.py and saw lots of corrupt JPEG image messages. Will this cause a problem with the game, and if so, how do we fix it?

Hi @annag5555 ,
The corrupt JPEG messages come from the Logitech camera driver, but they should not cause any issue in the game; the images are still captured correctly (it is just annoying to see them in the terminal, unfortunately).
We are using better cameras in the new version of Reachy, which have higher quality and do not produce these corrupted-data messages!


Hello,

I could not find any documentation explaining the starting position of the board (a cylinder in the middle? nothing?).
Same question for the starting positions of the pieces; I understood that they sit on the right-hand side of the board?

Hello @Thea,
The positions of the pieces are detailed in the Notion page we made for the application. The first cylinder grabbed by Reachy will be number 1, then 2, and so on, up to 5.

@Simon We are trying to use tic-tac-toe with the new Reachy 2021 Python SDK.
I guess that all the *.npz files in the move/ directory must be rebuilt to take into account the new names of the parts of Reachy’s arms?
The way to do this is to record each needed movement and np.savez the trajectories as *.npz files?

Hi @Jean_Luc,
Indeed, the name of each of Reachy’s parts should be changed to the new one. I updated the npz files; you can find them here. They should work with Reachy 2021. Make sure to check them on their own before using them in the TicTacToe app.
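
If you do want to re-record moves yourself, a minimal sketch of the record-and-save step (the read_present_positions() helper and the 'move' array key below are placeholders, not the app's actual code; match whatever your setup and the app expect):

import time
import numpy as np

def read_present_positions():
    # Hypothetical helper: replace with however you read the arm's joint
    # positions with the 2021 SDK; dummy values here so the sketch runs.
    return [0.0] * 7

record = []
freq, duration = 100, 5.0   # 100 Hz for 5 seconds
t0 = time.time()
while time.time() - t0 < duration:
    record.append(read_present_positions())
    time.sleep(1.0 / freq)

np.savez('my-move.npz', move=np.array(record))

# Quick sanity check before replaying it in the app:
data = np.load('my-move.npz')
print(data['move'].shape)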


Hi,
Thanks for the npz files: they work fine :clap:
We are currently updating the tictactoe program from the Reachy 2019 version to the 2021 one.
We took this notebook here to test the classification (with some adaptations for 2021 :grinning:).
The problem now is that with the ttt-boxes.tflite file (from tictactoe 2019) we get poor classification results:
(screenshot of the classification results)
Should we retrain the model?
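
As a minimal sketch of this kind of classification test (assuming the edgetpu ClassificationEngine; the image path and top_k are placeholders and may differ from the notebook):

from PIL import Image
from edgetpu.classification.engine import ClassificationEngine

engine = ClassificationEngine('ttt-boxes.tflite')
img = Image.open('box-crop.jpg')   # one cropped square of the board

# Print the top candidate labels and their scores.
for label_id, score in engine.classify_with_image(img, top_k=3):
    print(label_id, score)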


Great to know you’re getting close!

Yes, you should retrain your model. The model we used was trained to detect our wooden pieces on a cork board, so it is not surprising that it does not work well here :slightly_smiling_face:
The tutorial to retrain your model can be found where you found the notebook, here.


Hi :wave:

Some news about the update of the tictactoe game for Reachy 2021 and the classification work conducted during my summer 2021 internship at ENSAM Bordeaux.

I have changed the strategy for the recognition of the game pieces: we now use Object Recognition algorithms to detect 3 types of objects:

  • cubes
  • cylinders
  • empty square.

The main advantage of this algorithm is that it gives both:

  • the nature of the detected object (cube/cylinder/empty)
  • the associated bounding box.

By processing the bounding box data we can now compute where the detected object lies on the grid:

(screenshot: object detection on the board)

This approach avoids cutting the image into 9 small images (one for each square of the grid) and classifying them separately, which is a bit fragile because it is difficult to position the head precisely :face_with_head_bandage:.
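
To illustrate the bounding-box-to-grid idea, here is a minimal sketch (the board boundaries, class names, and example boxes are made up for the illustration):

import numpy as np

# Board boundaries in image pixels (made-up values for the example).
X_MIN, X_MAX = 100, 400
Y_MIN, Y_MAX = 80, 380

def box_to_square(xmin, ymin, xmax, ymax):
    """Map a detection's bounding box to a square index 0..8 (row-major)."""
    cx = (xmin + xmax) / 2
    cy = (ymin + ymax) / 2
    col = int(np.clip((cx - X_MIN) / (X_MAX - X_MIN) * 3, 0, 2))
    row = int(np.clip((cy - Y_MIN) / (Y_MAX - Y_MIN) * 3, 0, 2))
    return 3 * row + col

board = ['empty'] * 9
for label, box in [('cylinder', (110, 90, 180, 170)), ('cube', (310, 290, 380, 370))]:
    board[box_to_square(*box)] = label

print(np.array(board).reshape(3, 3))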

You can see this video showing the real-time object detection performance of our solution.

We used the TensorFlow 1 Object Detection API, which is compatible with Reachy’s Edge TPU processor. More details will be posted soon in a GitHub repository, including the whole procedure to train the model with a customized data set.

We will also soon publish the new tictactoe repository for Reachy 2021.


Great work! Can’t wait to try the code on our own Reachy.

The repository for the new tictactoe for Reachy 2021 will be available in September… still some fine-tuning to do…
In particular, concerning the processing of the objects’ bounding boxes to compute where they lie on the grid, @simon can you tell us the reference frame and the positions of the grid squares that are used to pass the flattened board to the rm_agent.value_actions function?

Hi @Jean_Luc,
The grid squares are defined in the board_cases array in vision.py and the frame used to get the board configuration is recovered in the analyse_board method of tictactoe_playground.py.

Hope this is what you wanted!

Hi @Thea and @Jean_Luc,
Any updates on how it is going? :blush: