Control a 3D Reachy visualisation

Due to the covid-19 situation, most of us are confined at home and working remotely. Being a roboticist at this time can be a bit frustrating: not everyone has a Reachy robot at home to program.

Thankfully, here comes a 3D visualisation solution to keep you busy!

Even when there is no virus around, having a visualisation tool can prove really useful.

  • It lets you move the simulated robot without any risk of breaking anything. It’s a good way to learn how to control Reachy without fear.
  • You can try controlling a simulated Reachy before purchasing one, to better understand what it can and cannot do.
  • It can be used extensively, with as many robots as you want, for instance to train your machine learning algorithms.

So, in the Pollen team, one of our main goals for this year is to provide a complete simulation tool for our Reachy robot.

A 3D visualisation

I started working on this 3D visualisation a couple of weeks ago, and the first prototype is ready to be used! It is still a work in progress and many additional features are yet to come, but you can already start using it today.

It’s important to make clear that, for the moment, this is more a visualisation tool than a complete simulator. I definitely want to push the software in that direction, but it will take more time (see the roadmap later in this post).

This tool lets you send commands through our Python API to a 3D model of a full Reachy robot (both arms and a head) and see their effect on the model. In the gif below, you can see how to make your robot wave with a few lines of code:
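For readers who cannot see the gif, the waving motion can be sketched with a small helper. This is my own illustration: the `wave` function, its angles, and its timing are hypothetical; only the `goal_position` attribute comes from the actual API shown later in this post.

```python
import time

def wave(joint, center=-60.0, amplitude=20.0, cycles=3, step_time=0.5):
    # Alternate the joint between two goal positions around `center`.
    # `joint` is any object exposing a writable `goal_position` attribute,
    # e.g. r.right_arm.elbow_pitch on a connected Reachy instance.
    for i in range(2 * cycles):
        offset = amplitude if i % 2 else -amplitude
        joint.goal_position = center + offset
        time.sleep(step_time)

# On a connected (or visualised) Reachy you would call, for instance:
# wave(r.right_arm.elbow_pitch)
```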

Getting started

So, what do you need to do the same on your own computer?

First, you need to install our Python package. It can be found here: or directly on PyPI. It requires Python 3 and a few classical dependencies (numpy, scipy, etc.). This software is the same one that runs on the real robot; I simply added a new IO layer that replaces the communication with the hardware (motors and sensors) with a WebSocket connection to the 3D visualisation.

Both the hardware and the visualisation share the same API. The only important difference is that, when creating your robot, you need to set all io to ‘ws’:

from reachy import parts, Reachy

r = Reachy(
    right_arm=parts.RightArm(io='ws', hand='force_gripper'),
    left_arm=parts.LeftArm(io='ws', hand='force_gripper'),
)

Then, the visualisation can be accessed directly from You only need a web browser that supports WebGL. It should work on all reasonably recent browsers. Mobile support is only partial at this time (see for details).

Then, if you run commands using the Python API, you should see the visualisation move in your web browser!
For instance:

r.right_arm.elbow_pitch.goal_position = -80

Just remember:

  • run the Python code to create the Reachy instance
  • open the visualisation link, or click the connect button if the page is already open but not connected
  • run any commands you want using the Python API


This first version lets you move the 3D visualisation of a full Reachy, but there are still many limitations:

  • We do not use any physics simulation at the moment. Motors teleport directly to their goal positions, with infinite acceleration. This is obviously different from the real robot. We plan to provide a simulation with physics, collisions, and interaction with objects, but this will take more time.
  • The camera and the Orbita neck cannot be controlled at the moment. We are working on this right now!
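Until physics lands, one way to avoid the teleporting look is to interpolate the goal position yourself. Here is a minimal sketch: the `smooth_goto` helper and its parameters are my own invention; it only assumes a joint object with a `goal_position` attribute, as used elsewhere in this post.

```python
import time

def smooth_goto(joint, target, duration=1.0, steps=50):
    # Send many small intermediate goals instead of one big jump,
    # so the motion looks continuous despite the lack of physics.
    start = joint.goal_position
    for i in range(1, steps + 1):
        joint.goal_position = start + (target - start) * i / steps
        time.sleep(duration / steps)

# e.g. smooth_goto(r.right_arm.elbow_pitch, -80, duration=2.0)
```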

So there are quite a lot of cool things to come, to keep us busy :slight_smile:
Do not hesitate to try the visualisation and let us know what you think! We hope to share more soon.


Thank you for the nice present! :+1:

I “accidentally” closed the visualization page and it was not possible to get the connection up again. Here is how I fixed it (Ubuntu 18.04):

  • Installed lsof
    sudo apt-get install lsof

  • got the TCP port used
    sudo lsof -i | grep "py"
    It returned:
    python3 .....bla.bla.bla...... TCP *:6171 (LISTEN)

  • killed the connection
    sudo lsof -t -i tcp:6171 | xargs kill -9
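The same check can be done from Python before killing anything, to see whether the old server is still holding the port. A sketch using only the standard library; port 6171 is just the one lsof reported in my case, yours may differ.

```python
import socket

def port_in_use(port, host='127.0.0.1'):
    # Try to connect; connect_ex returns 0 when something is
    # still listening (i.e. the old server is alive).
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

# e.g. if port_in_use(6171): the old WebSocket server is still running
```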

Thanks for the info @Andrea!

It seems that you basically had to kill and restart the Python server side. That’s strange; you should be able to connect and disconnect as many times as you want from the visualisation page.
Do you still have the issue with the new version?

No issue; the problem happens when you close the browser before disconnecting from the virtual Reachy. In that case the old connection stays open and the new browser is not able to connect.

I’ve updated the description on my first post as we have made a few updates.

  • First, we changed the URL. Now you can access the simulation here:
  • Then, we added simple controls to navigate the scene: rotate/zoom/pan/tilt with your mouse.
  • We also made the connection easier. We added a connect/disconnect button that lets you reconnect easily. @Andrea hopefully, you won’t need to kill the Python server anymore
  • We fixed a few bugs, and you can now add the head part, even if it’s not controllable yet.
  • And the most visible change is that we worked on the scene. I hope you’ll like it and that it will help you escape your confinement :slight_smile:


Just wondering if anyone else is getting kernel panics on their Mac
when running

It’s repeatable: whenever I load this page and try to use it via the Reachy Python API,
my Mac will kernel panic.

Appears to happen on both Chrome and Firefox.

On Chrome it will kernel panic; on Firefox it will either
kernel panic or, at times, Firefox will catch it and just terminate the tab.

Hi @Geoff,

Sorry to hear about your trouble. From my understanding, a kernel panic usually comes from a hardware issue. Our simulator is built on WebGL, which tries to use your GPU. Maybe that’s where the problem comes from?