This is the official implementation of the ICRA 2024 paper "AdvGPS: Adversarial GPS for Multi-Agent Perception Attack" by Jinlong Li, Baolu Li, Xinyu Liu, Jianwu Fang, Felix Juefei-Xu, Qing Guo, and Hongkai Yu.
IEEE International Conference on Robotics and Automation (ICRA) 2024!
Our experiments are conducted on the publicly available benchmark dataset for V2V cooperative perception: OPV2V. You can download the data from the OPV2V project page.
To set up the codebase environment, follow these steps:

```shell
conda create -n attack python=3.7
conda activate attack
```

Take PyTorch 1.12.0 as an example:

```shell
conda install pytorch==1.12.0 torchvision==0.13.0 cudatoolkit=11.3 -c pytorch -c conda-forge
pip install spconv-cu113
pip install -r requirements.txt
python setup.py develop
python opencood/utils/setup.py build_ext --inplace
```

OpenCOOD uses yaml files to configure all the parameters for training. To train your own model from scratch or from a continued checkpoint, run the following commands:
```shell
python opencood/tools/train_attack_pose_multi_mmd.py --hypes_yaml ${CONFIG_FILE} [--model_dir ${CHECKPOINT_FOLDER} --half]
```

Arguments explanation:
- `hypes_yaml`: the path of the attack configuration file, e.g. `opencood/hypes_yaml/point_pillar_intermediate_fusion.yaml`. See Tutorial 1: Config System to learn more about the rules of the yaml files, and see the `hypes_yaml` folder for the available configurations.
- `model_dir` (optional): the path of the checkpoints. This is used to attack the trained models.
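To give an intuition for what the attack script optimizes, here is a minimal conceptual sketch of an adversarial GPS-pose perturbation. It is not the paper's implementation: the loss, its gradient, the target direction, and all step sizes are illustrative stand-ins, whereas the real attack back-propagates a detection loss through the multi-agent fusion model. The sketch only shows the common pattern of projected gradient ascent on a pose offset kept inside a small bound so the perturbation stays plausible.

```python
import numpy as np

def toy_loss_grad(delta):
    """Gradient of a stand-in loss to MAXIMISE w.r.t. the pose offset.

    Toy surrogate: L(delta) = -0.5 * ||delta - target||^2, whose ascent
    direction pulls delta toward a hypothetical worst-case offset. The
    real attack would compute this gradient through the fusion detector.
    """
    target = np.array([0.5, -0.5, 0.1])  # hypothetical (x, y, yaw) direction
    return -(delta - target)

def pgd_pose_attack(epsilon, steps=50, lr=0.1):
    """Projected gradient ascent on a GPS pose offset (x, y, yaw).

    After each ascent step, the offset is clipped back into an L-inf ball
    of radius epsilon, keeping the spoofed pose close to the true one.
    """
    delta = np.zeros(3)
    for _ in range(steps):
        delta += lr * toy_loss_grad(delta)          # ascend the loss
        delta = np.clip(delta, -epsilon, epsilon)   # project into the bound
    return delta

delta = pgd_pose_attack(epsilon=0.2)
# Every component of delta stays within [-0.2, 0.2].
```

The clipping step is what makes such pose attacks hard to detect: each broadcast GPS message remains within normal localization error, yet the accumulated misalignment degrades the fused detections.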
```bibtex
@inproceedings{li2024advgps,
  title={AdvGPS: Adversarial GPS for Multi-Agent Perception Attack},
  author={Li, Jinlong and Li, Baolu and Liu, Xinyu and Fang, Jianwu and Juefei-Xu, Felix and Guo, Qing and Yu, Hongkai},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={18421-18427},
  year={2024},
  organization={IEEE}
}
```

The codebase is built upon OpenCOOD, which is the first open cooperative detection framework for autonomous driving.
