[ICRA2024] The official implementation of paper "AdvGPS: Adversarial GPS for Multi-Agent Perception Attack"


AdvGPS: Adversarial GPS for Multi-Agent Perception Attack


This is the official implementation of the ICRA 2024 paper "AdvGPS: Adversarial GPS for Multi-Agent Perception Attack" by Jinlong Li, Baolu Li, Xinyu Liu, Jianwu Fang, Felix Juefei-Xu, Qing Guo, and Hongkai Yu.

IEEE International Conference on Robotics and Automation (ICRA) 2024!


Data Download

Our experiments are conducted on the publicly available benchmark dataset for V2V cooperative perception tasks: OPV2V. You can download the data from the OPV2V project page.

Getting Started

Environment Setup

To set up the codebase environment, follow these steps:

1. Create a conda environment (Python >= 3.7)

conda create -n attack python=3.7
conda activate attack

2. PyTorch Installation (>= 1.12.0 required)

Taking PyTorch 1.12.0 as an example:

conda install pytorch==1.12.0 torchvision==0.13.0 cudatoolkit=11.3 -c pytorch -c conda-forge

3. spconv 2.x Installation

pip install spconv-cu113

4. Install other dependencies

pip install -r requirements.txt
python setup.py develop

5. Install the CUDA version of the bounding-box NMS calculation

python opencood/utils/setup.py build_ext --inplace

Attack your model

OpenCOOD uses yaml files to configure all the parameters for training. To train your own model from scratch or continue from a checkpoint, run the following commands:

python opencood/tools/train_attack_pose_multi_mmd.py --hypes_yaml ${CONFIG_FILE} [--model_dir  ${CHECKPOINT_FOLDER} --half]

Arguments Explanation:

  • hypes_yaml: the path of the attack configuration file, e.g. opencood/hypes_yaml/point_pillar_intermediate_fusion.yaml. See Tutorial 1: Config System to learn more about the rules of the yaml files, and see the hypes_yaml folder for the available configurations.
  • model_dir (optional): the path of the checkpoint folder. This is used to attack the trained models.
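The command-line interface described above can be sketched with Python's argparse. The flag names mirror the README; the parser below is illustrative, not the repository's actual argument-handling code.

```python
# Illustrative sketch of the attack script's CLI (hypothetical parser,
# not the actual code in train_attack_pose_multi_mmd.py).
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Attack a trained model")
    # Required: path to the attack configuration yaml file.
    parser.add_argument("--hypes_yaml", required=True,
                        help="path to the attack configuration yaml")
    # Optional: checkpoint folder of the trained model to attack.
    parser.add_argument("--model_dir", default=None,
                        help="path to the trained-model checkpoint folder")
    # Optional flag: run in half precision.
    parser.add_argument("--half", action="store_true",
                        help="use fp16 (half precision)")
    return parser

# Example invocation matching the README's usage pattern:
args = build_parser().parse_args(
    ["--hypes_yaml",
     "opencood/hypes_yaml/point_pillar_intermediate_fusion.yaml",
     "--half"])
print(args.half)       # True
print(args.model_dir)  # None (no checkpoint given)
```

When --model_dir is supplied, the script attacks the checkpoint in that folder; otherwise it starts from the configuration alone, as described above.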

Citation

@inproceedings{li2024advgps,
  title={AdvGPS: Adversarial GPS for Multi-Agent Perception Attack},
  author={Li, Jinlong and Li, Baolu and Liu, Xinyu and Fang, Jianwu and Juefei-Xu, Felix and Guo, Qing and Yu, Hongkai},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={18421-18427},
  year={2024},
  organization={IEEE}
}

Acknowledgment

The codebase is built upon OpenCOOD, which is the first open cooperative detection framework for autonomous driving.
