Accompanying code for training VisuoSkin policies, as described in the paper *Learning Precise, Contact-Rich Manipulation through Uncalibrated Tactile Skins*.
ViSk is a framework for learning visuotactile policies for fine-grained, contact-rich manipulation tasks. It uses a transformer-based architecture in conjunction with AnySkin, and achieves significant improvements over both vision-only policies and visuotactile policies that use high-dimensional tactile sensors such as DIGIT.
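At a high level, a policy of this kind projects camera features and AnySkin signals into a shared token space and lets a transformer attend across both modalities before predicting actions. The sketch below is purely illustrative: the layer sizes, the single-token-per-modality layout, the mean pooling, and the 15-dimensional skin reading (five triaxial magnetometers) are assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    """Toy visuotactile policy: per-modality tokens feed a shared transformer."""

    def __init__(self, img_dim=512, skin_dim=15, d_model=256, action_dim=7):
        super().__init__()
        self.img_proj = nn.Linear(img_dim, d_model)    # camera feature token
        self.skin_proj = nn.Linear(skin_dim, d_model)  # tactile token
        layer = nn.TransformerEncoderLayer(
            d_model, nhead=4, dim_feedforward=512, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.action_head = nn.Linear(d_model, action_dim)

    def forward(self, img_feats, skin_feats):
        # img_feats: (B, 1, img_dim); skin_feats: (B, 1, skin_dim)
        tokens = torch.cat(
            [self.img_proj(img_feats), self.skin_proj(skin_feats)], dim=1
        )
        encoded = self.encoder(tokens)                 # (B, 2, d_model)
        return self.action_head(encoded.mean(dim=1))   # (B, action_dim)

policy = VisuoTactilePolicy()
print(policy(torch.randn(4, 1, 512), torch.randn(4, 1, 15)).shape)  # (4, 7)
```

One appeal of uncalibrated skins in this setup is that the raw signal is low-dimensional enough to be treated as a single token, in contrast to image-like tactile sensors such as DIGIT.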
- Clone this repository
```
git clone https://github.com/raunaqbhirangi/visuoskin.git
```
- Create a conda environment and install dependencies
```
conda env create -f env.yml
pip install -r requirements.txt
```
- Move raw data to your desired location and set `DATA_DIR` in `utils.py` to point to this location. Similarly, set `root_dir` in `cfgs/local_config.yaml`.
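For concreteness, the `utils.py` edit is a single assignment; the path below is a placeholder to replace with your own location.

```python
# utils.py (excerpt) -- point DATA_DIR at your raw demonstration data.
# The path shown is a placeholder.
DATA_DIR = "/path/to/raw_demo_data"
```

In `cfgs/local_config.yaml`, `root_dir` is set analogously, e.g. `root_dir: /path/to/data`.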
- Process data for the `current-task` (name of the directory containing demonstration data for the current task) and convert it to pkl
```
python process_data.py -t current-task
python convert_to_pkl.py -t current-task
```
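To sanity-check the conversion, the generated pickle can be loaded directly. The path below is a placeholder, and the assumption that the file holds a sequence of demonstrations may not match the repo's actual layout:

```python
import pickle

# Placeholder path; check where convert_to_pkl.py writes its output.
with open("/path/to/current-task.pkl", "rb") as f:
    demos = pickle.load(f)

print(type(demos))
try:
    # If the pickle holds a sequence of demonstrations, report the count.
    print(f"{len(demos)} demonstrations loaded")
except TypeError:
    print("loaded a single object:", demos)
```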
- Install `xarm-env` using `pip install -e envs/xarm-env`
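To confirm the editable install is visible to your interpreter, a quick check (the importable module name `xarm_env` is an assumption based on the package name; adjust if it differs):

```python
import importlib.util

# "xarm_env" is assumed; packages named with a hyphen are usually
# imported with an underscore instead.
print("xarm-env importable:", importlib.util.find_spec("xarm_env") is not None)
```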
- Run BC training
```
python train_bc.py 'suite.task.tasks=[current-task]'
```
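The quoted argument appears to be a Hydra-style override, which suggests multiple tasks can be trained together with a comma-separated list, e.g. `python train_bc.py 'suite.task.tasks=[task-a,task-b]'`. To preview how such an override is parsed without launching a run, Hydra's compose API can be used; the `config_path`/`config_name` below are guesses rather than the repo's actual entry point:

```python
from hydra import compose, initialize

# config_path/config_name are guesses; match them to what train_bc.py uses.
with initialize(version_base=None, config_path="cfgs"):
    cfg = compose(config_name="config",
                  overrides=["suite.task.tasks=[current-task]"])
print(cfg.suite.task.tasks)  # ['current-task']
```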
