How to access teleoperation command signals (intent) from Reachy2 for predictive AR digital twin visualization?

Hello everyone,

I’m currently working on a research project using Reachy2, where I’ve built a real-time AR/VR digital twin that mirrors the robot’s movements. This part is already working: the digital twin reflects Reachy’s joint and base motion correctly.

Now, I want to move beyond mirroring and implement prediction — meaning the AR digital twin should visualize what the robot is about to do slightly before the physical motion happens (e.g., a short-horizon predictive display).

To do this, I need access not just to the robot’s current state, but ideally to the teleoperation command stream or target setpoints that the operator is sending (e.g., base velocity commands, joint targets, trajectories, etc.).

My main questions:

  1. When Reachy2 is teleoperated (e.g., via VR/joystick/UI), where in the stack do the actual control commands live?

    • Are they published as ROS2 topics (e.g., /cmd_vel, joint trajectories, etc.)?

    • Or are they only internal to the Reachy SDK / control stack?

  2. Is there a way to access:

    • Desired joint positions/velocities (targets)?

    • Base motion commands (before execution)?

    • Any kind of “intent” signal rather than a measured state?

  3. If no explicit command topics exist, is there a way to access:

    • Controller setpoints?

    • Target states before they are applied?

  4. Has anyone implemented or explored a predictive display/intent visualization with Reachy before?

My goal is to use this information to visualize the robot’s future pose in AR (e.g., 200–500 ms ahead), not just its current pose.

Any guidance on where to tap into the teleoperation pipeline, relevant topics, SDK hooks, or best practices would be hugely appreciated.

Thanks a lot!

Hi @HabilH,

For the robot's parts:

  • Position commands are sent via WebRTC to the ROS topic /joint_commands.
  • The SDK server then forwards them to {part_name}/ik_target_pose; after IK is computed, the result is published on the topic /{part}_forward_position_controller/commands.
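Since these topics carry the operator's *targets* rather than measured state, a subscriber to /{part}_forward_position_controller/commands already gives you an intent signal. A minimal sketch of how the AR twin could use it, assuming the usual ros2_control convention that the forward position controller consumes a std_msgs/Float64MultiArray (worth verifying on your robot): buffer the timestamped target vectors and linearly extrapolate them a short horizon ahead. The ROS plumbing is omitted here; only the extrapolation logic is shown.

```python
from collections import deque


class JointTargetExtrapolator:
    """Buffers timestamped joint-target vectors (e.g. fed from a ROS 2
    subscription to /{part}_forward_position_controller/commands) and
    linearly extrapolates them a short horizon into the future, so the
    AR twin can render a predicted pose 200-500 ms ahead."""

    def __init__(self, maxlen=10):
        # each sample is (time_in_seconds, [q0, q1, ...])
        self.samples = deque(maxlen=maxlen)

    def push(self, t, targets):
        """Record one target vector, e.g. from the subscriber callback."""
        self.samples.append((t, list(targets)))

    def predict(self, horizon):
        """Extrapolate the latest target `horizon` seconds ahead using the
        velocity implied by the last two samples. Returns None if no
        sample has been received yet."""
        if not self.samples:
            return None
        if len(self.samples) < 2:
            return list(self.samples[-1][1])
        (t0, q0), (t1, q1) = self.samples[-2], self.samples[-1]
        dt = t1 - t0
        if dt <= 0:
            return list(q1)
        return [b + (b - a) / dt * horizon for a, b in zip(q0, q1)]
```

In practice you would call `push()` from the subscriber callback and `predict(0.3)` at your AR render rate; a two-sample linear fit is crude but cheap, and you can swap in a longer least-squares fit over the buffer if the commands are noisy.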

For the mobile base:

  • The velocity commands requested by the VR are published via ROS on the /cmd_vel topic (linear.x, linear.y, and angular.z).
  • The mobile base's HAL listens to this topic. What it will actually execute based on these commands is computed in the HAL (mobile_base/zuuu_hal/zuuu_hal/zuuu_hal.py, develop branch of pollen-robotics/mobile_base on GitHub): in particular, the wheel speeds needed to perform the motion. For prediction purposes, though, the motion actually performed is hard to anticipate accurately because of friction.
  • The expected wheel speeds are published on the topics /back_wheel_rpm, /left_wheel_rpm, and /right_wheel_rpm.
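For the predictive display of the base, a simple option is to dead-reckon the pose forward from the last /cmd_vel command (a geometry_msgs/Twist, with velocities expressed in the base frame). The sketch below is a plain Euler integration of a holonomic base, ignoring the friction and slip effects mentioned above, so treat its output as the *commanded* future pose rather than the true one; the function name and signature are illustrative, not part of any Reachy API.

```python
import math


def predict_base_pose(x, y, theta, vx, vy, wz, horizon, dt=0.01):
    """Dead-reckon the mobile base pose `horizon` seconds ahead from the
    latest commanded body-frame velocities (linear.x -> vx, linear.y -> vy,
    angular.z -> wz, as on /cmd_vel). Euler integration with step `dt`;
    friction and slip are deliberately ignored."""
    t = 0.0
    while t < horizon:
        step = min(dt, horizon - t)
        # rotate the body-frame velocity into the world frame, then integrate
        x += (vx * math.cos(theta) - vy * math.sin(theta)) * step
        y += (vx * math.sin(theta) + vy * math.cos(theta)) * step
        theta += wz * step
        t += step
    return x, y, theta
```

You would refresh `(vx, vy, wz)` from each incoming Twist and re-run the prediction every AR frame; for the 200-500 ms horizons you mention, the integration error from ignoring dynamics is usually dominated by the friction uncertainty anyway.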

Good luck with your research, I hope this helps!