
RoboPack: Learning Tactile-Informed Dynamics Models for Dense Packing

RoboPack is a framework that integrates tactile-informed state estimation, dynamics prediction, and planning for manipulating objects with unknown physical properties. It extends the previous works RoboCraft and RoboCook by incorporating tactile-informed physical state estimation to handle uncertainty in object properties, such as unknown mass distribution or compliance.

Packing objects with varying deformability using one initial visual observation and dense tactile feedback:

[Video: packing-1-2-12x.mp4]

RoboPack: Website | Paper

If you find this codebase useful for your research, please consider citing:

@article{ai2024robopack,
  title={RoboPack: Learning Tactile-Informed Dynamics Models for Dense Packing},
  author={Bo Ai and Stephen Tian and Haochen Shi and Yixuan Wang and Cheston Tan and Yunzhu Li and Jiajun Wu},
  journal={Robotics: Science and Systems (RSS)},
  year={2024},
  url={https://www.roboticsproceedings.org/rss20/p130.pdf},
}

and the previous works, RoboCraft and RoboCook, that this codebase builds upon.

Environment

Dependencies have been exported to requirement.txt. The most important requirement is to have compatible versions of torch and torch_geometric.
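A minimal setup sketch, assuming a fresh conda environment and the requirement.txt file above (the environment name and Python version are illustrative, not prescribed by the repository):

conda create -n robopack python=3.9
conda activate robopack
pip install -r requirement.txt

# Sanity-check that torch and torch_geometric import together without version conflicts
python -c "import torch, torch_geometric; print(torch.__version__, torch_geometric.__version__)"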

Sample Dataset

We provide a small sample dataset to help get started with running the pipeline. You can download it here.
After downloading, please unzip it in the project root folder:

cd robopack
unzip data.zip

The example commands below assume that the data directory robopack/data has already been set up.
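To confirm the dataset is in place before running the commands below, you can list the directory (its exact contents depend on the sample dataset version):

ls data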

Learning Tactile Auto-Encoder

First, navigate to the dynamics directory:

cd dynamics

Below is an example command for training a tactile encoder on the box-pushing dataset:

python train_tactile_encoder.py --config model_configs/estimator_predictor_tac_packing_seq25.json

In practice, we train the encoder on an aggregated dataset; the trained encoder is then shared across tasks.

To inspect a pretrained autoencoder, you can generate visualizations by running it in test mode on a checkpoint. For example:

python train_tactile_encoder.py --config model_configs/estimator_predictor_tac_boxes.json --test pretrained_ae/v24_5to5_epoch=101-step=70482_corrected.ckpt

The generated visualization videos will be saved in ae_visualizations.

Dynamics Learning

To run a minimal example of dynamics learning, run one of the following:

python train_dynamics.py --config model_configs/estimator_predictor_tac_boxes.json  # box pushing task
python train_dynamics.py --config model_configs/estimator_predictor_tac_packing_seq25.json  # dense packing task 
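If your machine has multiple GPUs, you can pin a run to a single device with the standard CUDA environment variable; the device index here is just an example:

CUDA_VISIBLE_DEVICES=0 python train_dynamics.py --config model_configs/estimator_predictor_tac_boxes.json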
