THU-VCLab/RegionNormalizedGrasp
Region-aware Grasp Framework with Normalized Grasp Space for Efficient 6-DoF Grasping

CoRL 2024

Official code of the paper "Region-aware Grasp Framework with Normalized Grasp Space for Efficient 6-DoF Grasping".

The codebase is built mainly on our former work HGGD. We highly recommend reading and running HGGD before using this repo.

Framework

(Figure: framework overview)

Requirements

  • Python >= 3.8
  • PyTorch >= 1.10
  • pytorch3d
  • numpy==1.23.5
  • pandas
  • cupoch
  • numba
  • grasp_nms
  • matplotlib
  • open3d
  • opencv-python
  • scikit-image
  • tensorboardX
  • torchsummary
  • tqdm
  • transforms3d
  • trimesh
  • autolab_core
  • cvxopt

Installation

This code has been tested on Ubuntu 20.04 with CUDA 11.1/11.3/11.6, Python 3.8/3.9, and PyTorch 1.11.0/1.12.0.

Get the code.

git clone https://github.com/THU-VCLab/RegionNormalizedGrasp.git

Create new Conda environment.

conda create -n rngnet python=3.8
conda activate rngnet
cd RegionNormalizedGrasp

Please install PyTorch and PyTorch3D manually:

# pytorch-1.11.0
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113
# pytorch3d
pip install fvcore
pip install --no-index --no-cache-dir pytorch3d -f https://dl.fbaipublicfiles.com/pytorch3d/packaging/wheels/py38_cu113_pyt1110/download.html

Install other packages via Pip.

pip install -r requirements.txt
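After installation, a quick stdlib-only sanity check (hypothetical, not part of the repo) can confirm that the key packages from the requirements list are importable:

```python
# Check that the core dependencies resolve without actually importing them.
# "cv2" is the import name of opencv-python.
import importlib.util

required = ["torch", "pytorch3d", "numpy", "open3d", "cv2", "numba"]
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing packages:", missing)  # an empty list means the environment looks complete
```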

Usage

Checkpoint

Checkpoints are available in the linked folder.

Preprocessed Dataset of HGGD

Preprocessed datasets (realsense.7z/kinect.7z) can be downloaded from Tsinghua Cloud.

They contain converted and refined grasp poses for each image in the GraspNet dataset.

Train

Training code will be released with our pre-processed patch-level grasp dataset.

Test

Download and unzip our preprocessed datasets (for convenience). Alternatively, you can remove the unnecessary parts in our test code and read images directly from the original GraspNet dataset API.

Run the test code (reads RGB and depth images from the GraspNet dataset and evaluates grasps):

bash test_graspnet.sh

Attention: if you switch cameras, remember to update the camera setting in config.py.

Typical hyperparameters:

center-num  # number of sampled local centers/regions; more centers yield more regions and grasps but slower inference, default: 48
embed-dim  # network width, default: 256
patch-size  # patch size for RNGNet, default: 64
local-k  # number of grasps detected in each local region, default: 10
scene-l & scene-r  # scene range; train: 0~100, seen: 100~130, similar: 130~160, novel: 160~190
input-h & input-w  # downsampled input image size, should be 640x360
local-thres & heatmap-thres  # grasp score and heatmap filter thresholds, set to 0.01 in our settings
dataset-path  # path to our preprocessed dataset (grasp poses are read from here)
scene-path  # path to the original graspnet dataset (images are read from here)
num-workers  # number of evaluation workers
dump-dir  # path where detected grasp poses are dumped (used in later evaluation)
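To illustrate how center-num and heatmap-thres interact, here is a minimal NumPy sketch of sampling local-region centers from a grasp heatmap (hypothetical function and variable names; the actual sampling logic lives in the repo's test code):

```python
import numpy as np

def sample_centers(heatmap, center_num=48, heatmap_thres=0.01):
    """Pick up to `center_num` local-region centers from a 2D grasp heatmap."""
    ys, xs = np.nonzero(heatmap > heatmap_thres)   # keep confident pixels only
    scores = heatmap[ys, xs]
    order = np.argsort(scores)[::-1][:center_num]  # highest scores first
    return np.stack([ys[order], xs[order]], axis=1)

heatmap = np.zeros((360, 640), dtype=np.float32)   # input-h x input-w
heatmap[100, 200] = 0.9
heatmap[50, 300] = 0.5
heatmap[10, 10] = 0.005                            # below threshold, dropped
centers = sample_centers(heatmap)
print(centers)  # two surviving centers, best-scoring first
```

With local-k grasps predicted per region, the number of output grasps is bounded by center-num × local-k, which is why raising center-num trades speed for coverage.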

Demo

Run the demo code (reads RGB and depth images from file and outputs grasps):

bash demo.sh

Typical hyperparameters:

center-num  # number of sampled local centers/regions; more centers yield more regions and grasps but slower inference, default: 48
embed-dim  # network width, default: 256
patch-size  # patch size for RNGNet, default: 64
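The patch-size hyperparameter controls the local window cropped around each sampled center. A minimal sketch of such a crop, with boundary clamping (hypothetical helper, assuming a 4-channel RGB-D array; the real cropping logic is in the demo code):

```python
import numpy as np

def crop_patch(image, center, patch_size=64):
    """Crop a patch_size x patch_size window around (y, x), clamped to the image."""
    h, w = image.shape[:2]
    half = patch_size // 2
    y0 = min(max(center[0] - half, 0), h - patch_size)
    x0 = min(max(center[1] - half, 0), w - patch_size)
    return image[y0:y0 + patch_size, x0:x0 + patch_size]

rgbd = np.random.rand(360, 640, 4).astype(np.float32)  # RGB + depth channels
patch = crop_patch(rgbd, (100, 200))
print(patch.shape)  # (64, 64, 4)
```

Clamping keeps every patch fully inside the image, so centers sampled near the border still produce fixed-size network inputs.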

Citation

Please cite our paper in your publications if it helps your research:

@inproceedings{chen2024regionaware,
title={Region-aware Grasp Framework with Normalized Grasp Space for Efficient 6-DoF Grasping},
author={Siang Chen and Pengwei Xie and Wei Tang and Dingchang Hu and Yixiang Dai and Guijin Wang},
booktitle={8th Annual Conference on Robot Learning},
year={2024},
url={https://openreview.net/forum?id=jPkOFAiOzf}
}
