Official implementation of the paper "Policy Contrastive Decoding for Robotic Foundation Models".
Note: We are doing our best to improve this work. If you have any questions or suggestions, please feel free to create an issue in this repo or contact us at [email protected].
[Project] [ArXiv] [PDF] [PCD-real] [PCD-LeRobot]
- 🔥Jan 26, 2026: 🎉🎉Our paper has been accepted to ICLR 2026!🎉🎉
- 🔥Oct 13, 2025: Our paper has been updated for better clarity and readability. The optimized version is now available on arXiv.
- 🔥May 20, 2025: The code is released and the paper is now available on arXiv.
**Abstract** Generalist robot policies, or robotic foundation models, hold immense potential to enable flexible, general-purpose, and dexterous robotic systems. Despite their advancements, our empirical experiments reveal that existing robot policies are prone to learning spurious correlations from pre-training trajectories, adversely affecting their generalization capabilities during inference. To tackle this, we propose a novel Policy Contrastive Decoding (PCD) approach, which redirects the robot policy's focus toward object-relevant visual clues by contrasting action probability distributions derived from original and object-masked visual inputs. As a training-free method, PCD can be used as a plugin to improve different types of robot policies without the need to finetune or access model weights. We conduct extensive experiments on top of three open-source robot policies, including the autoregressive policy OpenVLA and the diffusion-based policies Octo and $\pi_0$. The results in both simulation and real-world environments demonstrate PCD's flexibility and effectiveness, e.g., PCD enhances the state-of-the-art policy $\pi_0$ by 8% in the simulation environment and by 108% in the real-world environment.
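To make the core idea concrete, here is a minimal toy sketch of contrastive decoding between the original and object-masked views. This is not the repository's actual implementation — the function names, the `alpha` hyperparameter, and the logit-space combination below are illustrative assumptions about how such a contrast can be formed:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over action logits."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def pcd_logits(logits_original, logits_masked, alpha=1.0):
    """Contrastive decoding sketch: amplify evidence that depends on the
    task-relevant objects and down-weight cues that survive object masking.
    `alpha` controls the contrast strength (illustrative hyperparameter)."""
    logits_original = np.asarray(logits_original, dtype=float)
    logits_masked = np.asarray(logits_masked, dtype=float)
    return (1 + alpha) * logits_original - alpha * logits_masked

# Toy example: action 0 is favored by a spurious background cue that
# survives object masking; action 1 depends on the masked-out object.
orig = np.array([2.0, 1.8])    # policy logits on the original image
masked = np.array([2.0, 0.5])  # policy logits on the object-masked image

probs = softmax(pcd_logits(orig, masked, alpha=1.0))
print(int(np.argmax(probs)))  # the contrast flips the choice to action 1
```

For an autoregressive policy this kind of contrast would act on per-token action logits; how the distributions are contrasted for diffusion-based policies differs in detail, so please refer to the paper for the exact formulation.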
Simulated Environments
Real-world Environments
Note: The relevant code of the real-world experiments is available in PCD-real.
- Clone this repository.
```shell
git clone https://github.com/Koorye/PCD.git
```
- Install all dependencies.
```shell
conda create -n pcd python=3.10
conda activate pcd
bash scripts/install_dependencies.sh
```
- Download model checkpoints.
Note: Some of the checkpoints cannot be downloaded directly; you may need to download them manually from the links provided in the script.
```shell
bash scripts/download_pretrained_weights.sh
```
- Run evaluation on SimplerEnv.
```shell
bash scripts/default/inference/run.sh
```

Our work is built upon the following open-source projects: SimplerEnv, OpenVLA, Octo, Open Pi-0, Grounded SAM2, YOLO World, SED, Inpaint Anything. We thank the authors for releasing their code. If you use our model and code, please consider citing these works as well.