# Unitree G1

The Unitree G1 humanoid is now supported in LeRobot! You can teleoperate it, train loco-manipulation policies, test in simulation, and more. Both 29-DoF and 23-DoF variants are supported.

---

## Part 1: Getting Started

### Install the Unitree SDK

Follow the [unitree_sdk2_python installation guide](https://github.com/unitreerobotics/unitree_sdk2_python#installation). Tested with `unitree_sdk2py==1.0.1` and `cyclonedds==0.10.2`:

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
pip install -e .
cd ..
```
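Since `unitree_sdk2py==1.0.1` and `cyclonedds==0.10.2` are the only versions tested here, it can be worth confirming what the environment actually has before moving on. A small standard-library check (not part of LeRobot or the SDK):

```python
from importlib.metadata import version, PackageNotFoundError

def sdk_status(pkg: str, tested: str) -> str:
    """Report the installed version of a package next to the tested-known-good one."""
    try:
        return f"{pkg} {version(pkg)} (tested: {tested})"
    except PackageNotFoundError:
        return f"{pkg}: not installed"

# The two pins called out above
for pkg, tested in [("unitree_sdk2py", "1.0.1"), ("cyclonedds", "0.10.2")]:
    print(sdk_status(pkg, tested))
```

A version mismatch is not necessarily fatal, but it is the first thing to rule out when DDS communication misbehaves.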

### Install LeRobot

```bash
conda install ffmpeg -c conda-forge
conda install -c conda-forge "pinocchio>=3.0.0"
git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e .
```

### Connect the G1 to the Internet

On the robot's onboard computer, route traffic through a machine that has internet access (the gateway IP and DNS server below come from the original setup):

```bash
# Drop any stale default route, then route through the gateway machine
sudo ip route del default 2>/dev/null || true
sudo ip route add default via 192.168.123.200 dev eth0
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# Verify
ping -c 3 8.8.8.8
```

### Install the Unitree SDK on the G1

Follow the [unitree_sdk2_python installation guide](https://github.com/unitreerobotics/unitree_sdk2_python#installation):

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
python -m pip install -e .
cd ..
```

### Install LeRobot on the G1

```bash
git clone https://github.com/huggingface/lerobot.git
cd lerobot
conda install -c conda-forge "pinocchio>=3.0.0"
pip install -e .
```

> **Note:** The default password on the G1's onboard computer is `123`.

---

## Part 2: Teleoperation & Locomotion

### Run the Robot Server

On the robot (from `~/lerobot`):

```bash
cd ~/lerobot
python src/lerobot/robots/unitree_g1/run_g1_server.py --camera
```
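Before launching a client, you can check from your laptop that the server's camera port is reachable. This is a generic TCP probe, not a LeRobot utility, and `<robot-ip>` is a placeholder for your robot's address:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 5555 is the ZMQ camera port used in the commands below;
# replace <robot-ip> with the address you assigned to the G1.
print(port_open("<robot-ip>", 5555))
```

If this returns `False`, check the Ethernet link and that `run_g1_server.py` is still running before debugging anything on the LeRobot side.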

### Run the Locomotion Policy

You can run the teleoperation client from your laptop over Ethernet, over Wi-Fi (experimental), or directly on the robot itself. Be mindful of the latency your network introduces.

**From your laptop:**

```bash
lerobot-teleoperate \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.robot_ip= \
  --teleop.type=unitree_g1 \
  --teleop.id=wbc_unitree \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --display_data=true \
  --robot.controller=HolosomaLocomotionController
```
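The `--robot.cameras` value is a JSON dictionary, and quoting it correctly inside a shell command is error-prone. Building it programmatically is one way to avoid that; the keys below mirror the command above, with `server_address` left as a placeholder for your robot's IP:

```python
import json

# Keys mirror the --robot.cameras flag in the teleoperation command;
# <robot-ip> is a placeholder, not a real address.
camera_cfg = {
    "global_view": {
        "type": "zmq",
        "server_address": "<robot-ip>",
        "port": 5555,
        "camera_name": "head_camera",
        "width": 640,
        "height": 480,
        "fps": 30,
    }
}
print(json.dumps(camera_cfg))
```

Paste the printed string (wrapped in single quotes) as the flag's value.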

We support both [GrootLocomotionController](https://github.com/NVlabs/GR00T-WholeBodyControl) and [HolosomaLocomotionController](https://github.com/amazon-far/holosoma) via `--robot.controller`.

---

## Part 3: Loco-Manipulation with the Homunculus Exoskeleton

We provide a loco-manipulation solution via the Homunculus Exoskeleton — an open-source 7 DoF exoskeleton for whole-body control. Check it out [here](https://github.com/nepyope/hmc_exo).

### Calibrate

```bash
lerobot-calibrate \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo
```

During calibration, move each joint through its entire range of motion. After fitting, move the joint to a neutral position and press `n` to advance.

### Record a Dataset

```bash
lerobot-record \
  --robot.type=unitree_g1 \
  --robot.is_simulation=true \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo \
  --dataset.repo_id=your-username/dataset-name \
  --dataset.single_task="Test" \
  --dataset.num_episodes=2 \
  --dataset.episode_time_s=5 \
  --dataset.reset_time_s=5 \
  --dataset.push_to_hub=true \
  --dataset.streaming_encoding=true \
  --dataset.encoder_threads=2
```

> **Note:** Omit `--teleop.left_arm_config.port` and `--teleop.right_arm_config.port` if you're only using the joystick.

Example dataset: [nepyope/unitree_box_move_blue_full](https://huggingface.co/datasets/nepyope/unitree_box_move_blue_full)
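When planning longer sessions, the `--dataset.*` timing flags give a lower bound on wall-clock recording time (encoding and upload add more on top). Simple arithmetic, not a LeRobot utility:

```python
def session_length_s(num_episodes: int, episode_time_s: float, reset_time_s: float) -> float:
    """Lower bound on wall-clock recording time: per-episode recording plus reset."""
    return num_episodes * (episode_time_s + reset_time_s)

# The example command above: 2 episodes x (5 s record + 5 s reset)
print(session_length_s(2, 5, 5))  # → 20.0
```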

---

## Part 4: Training & Inference

### Train

```bash
python src/lerobot/scripts/lerobot_train.py \
  --dataset.repo_id=your-username/dataset-name  \
  --policy.type=pi05 \
  --output_dir=./outputs/pi05_training \
  --job_name=pi05_training \
  --policy.repo_id=your-username/your-repo-id \
  --policy.pretrained_path=lerobot/pi05_base \
  --policy.compile_model=true \
  --policy.gradient_checkpointing=true \
  --wandb.enable=true \
  --policy.dtype=bfloat16 \
  --policy.freeze_vision_encoder=false \
  --policy.train_expert_only=false \
  --steps=3000 \
  --policy.device=cuda \
  --batch_size=32
```
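As a sanity check on the training budget, `--steps` times `--batch_size` gives the number of samples the policy sees during the run (drawn with shuffling and repeats over the dataset). Again just arithmetic, not a LeRobot helper:

```python
def samples_seen(steps: int, batch_size: int) -> int:
    """Total training samples drawn: optimizer steps x batch size."""
    return steps * batch_size

# The command above: 3000 steps at batch size 32
print(samples_seen(3000, 32))  # → 96000
```

If your dataset is much smaller than this, expect many repeats per sample; scale `--steps` to taste.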

### Inference with RTC

Once trained, we recommend deploying the policy with real-time chunking (RTC) at inference time:

```bash
python examples/rtc/eval_with_real_robot.py \
  --policy.path=your-username/your-repo-id \
  --policy.device=cuda \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.controller=HolosomaLocomotionController \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --task="task_description" \
  --duration=1000 \
  --fps=30 \
  --rtc.enabled=true
```
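Real-time chunking keeps the robot moving while the next action chunk is being inferred: execution of the current chunk overlaps with inference, and the leading actions of the new chunk (which cover timesteps already executed during inference) are discarded at the stitch point. The toy sketch below illustrates only that stitching idea; it is not the LeRobot RTC implementation, which also smooths the transition between chunks:

```python
def stitch_chunks(chunks: list[list[int]], overlap: int) -> list[int]:
    """Concatenate action chunks, dropping each new chunk's first `overlap`
    actions: they correspond to timesteps that were already executed while
    the new chunk was being inferred."""
    timeline = list(chunks[0])
    for chunk in chunks[1:]:
        timeline.extend(chunk[overlap:])
    return timeline

# Two 4-action chunks with 2 steps of inference latency:
print(stitch_chunks([[0, 1, 2, 3], [10, 11, 12, 13]], overlap=2))  # → [0, 1, 2, 3, 12, 13]
```

The practical upshot: the robot never idles waiting for the policy, at the cost of committing to slightly stale actions around each chunk boundary.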

---

## Additional Resources

- [Unitree SDK Documentation](https://github.com/unitreerobotics/unitree_sdk2_python)
- [GR00T-WholeBodyControl](https://github.com/NVlabs/GR00T-WholeBodyControl)
- [Holosoma](https://github.com/amazon-far/holosoma)
- [LeRobot Documentation](https://github.com/huggingface/lerobot)
- [Unitree IL LeRobot](https://github.com/unitreerobotics/unitree_IL_lerobot)

---

_Last updated: March 2026_

