This repository contains the PyTorch implementation of DexDiffuser.
We introduce DexDiffuser, a novel dexterous grasping method that generates, evaluates, and refines grasps on partial object point clouds. DexDiffuser includes the conditional diffusion-based grasp sampler DexSampler and the dexterous grasp evaluator DexEvaluator. DexSampler generates high-quality grasps conditioned on object point clouds by iteratively denoising randomly sampled grasps.
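The iterative denoising described above can be sketched as a standard DDPM-style reverse loop. This is an illustrative sketch only: the grasp dimensionality, noise schedule, and the stand-in noise predictor below are assumptions, not the values or networks used in this repository.

```python
import numpy as np

rng = np.random.default_rng(0)
GRASP_DIM = 23          # hypothetical: wrist pose + hand joint angles
T = 50                  # number of denoising steps
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def denoise_model(g_t, t, obj_feature):
    """Stand-in for the learned noise predictor conditioned on the
    object point-cloud embedding; here it just predicts zero noise."""
    return np.zeros_like(g_t)

def sample_grasp(obj_feature):
    g = rng.standard_normal(GRASP_DIM)      # start from pure noise
    for t in reversed(range(T)):
        eps = denoise_model(g, t, obj_feature)
        # DDPM posterior mean update
        g = (g - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:                           # add noise except at the last step
            g = g + np.sqrt(betas[t]) * rng.standard_normal(GRASP_DIM)
    return g

grasp = sample_grasp(obj_feature=None)
print(grasp.shape)
```

In the actual method the noise predictor is a network conditioned on the object point-cloud encoding, so the denoised grasp is shaped by the observed geometry rather than the toy zero predictor used here.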
Paper | ArXiv | Project | Checkpoints
Published in: IEEE Robotics and Automation Letters (Volume 9, Issue 12, December 2024)
- Create a conda environment
conda create -n dexdiff python=3.8
conda activate dexdiff
pip install omegaconf einops urdf-parser-py hydra-core loguru plotly tqdm transformations trimesh matplotlib pyrender tensorboard tqdm transforms3d
- (optional) Install IsaacGym
Checkpoints for the sampler and evaluator
Place the weights in the ckpts folder
Extract object.zip into the data folder. Place the .pickle file into the dexdiffuser_data folder.
Modify the paths in the configs so that the model can find the data
Train the sampler
bash scripts/train_sampler.sh
Train the evaluator
bash scripts/train_evaluator.sh
Generate grasps (set guid_scale to use EGD)
bash scripts/sample.sh
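Conceptually, evaluator-guided diffusion (EGD) nudges each intermediate grasp along the gradient of the evaluator's predicted success score, scaled by guid_scale. The sketch below is a toy stand-in, not DexEvaluator: the quadratic score, its analytic gradient, and the step size are all illustrative assumptions.

```python
import numpy as np

GRASP_DIM = 23                             # hypothetical grasp dimension
target = np.zeros(GRASP_DIM)               # pretend the evaluator peaks here

def evaluator_score(g):
    # toy differentiable score standing in for the learned evaluator
    return -np.sum((g - target) ** 2)

def evaluator_grad(g):
    # analytic gradient of the toy score above
    return -2.0 * (g - target)

def guided_step(g, guid_scale=0.1):
    # after the usual denoising update, follow the evaluator gradient
    return g + guid_scale * evaluator_grad(g)

rng = np.random.default_rng(0)
g = rng.standard_normal(GRASP_DIM)
before = evaluator_score(g)
for _ in range(50):
    g = guided_step(g)
print(evaluator_score(g) > before)  # guidance increases the predicted score
```

In practice the guidance gradient would come from backpropagating through the learned evaluator at each denoising step; setting guid_scale to zero recovers plain, unguided sampling.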
Refine the generated grasps
bash scripts/refine.sh
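One common refinement scheme for evaluator-based pipelines perturbs each sampled grasp and keeps a perturbation only when the evaluator scores it higher. The toy version below illustrates that idea under stated assumptions (the actual procedure lives in scripts/refine.sh; the evaluator here is again a stand-in).

```python
import numpy as np

rng = np.random.default_rng(1)
GRASP_DIM = 23                      # hypothetical grasp dimension

def evaluator_score(g):
    return -np.sum(g ** 2)          # toy stand-in for the learned evaluator

def refine(g, iters=100, sigma=0.05):
    best = evaluator_score(g)
    for _ in range(iters):
        candidate = g + sigma * rng.standard_normal(GRASP_DIM)
        score = evaluator_score(candidate)
        if score > best:            # accept only improving perturbations
            g, best = candidate, score
    return g

g0 = rng.standard_normal(GRASP_DIM)
g1 = refine(g0)
print(evaluator_score(g1) >= evaluator_score(g0))  # True: score never decreases
```

Because a candidate is accepted only when it improves the score, the refined grasp is guaranteed to score at least as well as the initial sample.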
(optional) Test grasps in IsaacGym
python isaac_test_right.py --eval_dir path_to_grasps
Some examples generated by DexDiffuser
This work was supported by the Swedish Research Council, the Knut and Alice Wallenberg Foundation, and the European Research Council (ERC-BIRD-884807). The authors would also like to express their gratitude to Zheyu Zhuang for providing insightful feedback and to Ning Zhou for contributing an RTX 3090 graphics card.
If you want to cite us:
@ARTICLE{10753039,
author={Weng, Zehang and Lu, Haofei and Kragic, Danica and Lundell, Jens},
journal={IEEE Robotics and Automation Letters},
title={DexDiffuser: Generating Dexterous Grasps With Diffusion Models},
year={2024},
volume={9},
number={12},
pages={11834-11840},
doi={10.1109/LRA.2024.3498776}}
This project is licensed under the MIT License. See LICENSE for more details.

