Code for CVPR 2022 Paper: "Appearance and Structure Aware Robust Deep Visual Graph Matching: Attack, Defense and Beyond" by Qibing Ren, Qingquan Bao, Runzhong Wang, and Junchi Yan.
03/27/2022 - Our code is released.
The code is built on ThinkMatch, and the basic environment setup also follows it. We recommend using Docker for a quick environment setup. For manual configuration, please refer to ThinkMatch for details.
- We maintain a prebuilt image on Docker Hub: `runzhongwang/thinkmatch:torch1.6.0-cuda10.1-cudnn7-pyg1.6.3`. It can be used with Docker or any other container runtime that supports Docker images, e.g. Singularity.
- We also provide a `Dockerfile` to build your own image (you may need `docker` and `nvidia-docker` installed on your machine).
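The Docker route can be sketched as follows. The image tag is the one listed above; the run flags (GPU access, volume mount, container paths) are illustrative assumptions you may need to adapt to your setup:

```bash
# Pull the prebuilt image (tag taken from the list above).
docker pull runzhongwang/thinkmatch:torch1.6.0-cuda10.1-cudnn7-pyg1.6.3

# Start an interactive container with GPU access and the repo mounted at
# /workspace; these flags are illustrative and depend on your
# docker/nvidia-docker installation.
docker run --gpus all -it -v "$PWD":/workspace \
    runzhongwang/thinkmatch:torch1.6.0-cuda10.1-cudnn7-pyg1.6.3 bash
```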
- `train_eval.py` implements the robust training pipeline, while `eval.py` contains the evaluation code.
- `attack_utils.py` defines the class `AttackGM`, which implements our locality attack as well as several attack baselines.
- `src/loss_func.py` implements our regularization loss via the parent class `GMLoss`.
- `src/utils/config.py` defines a global hyper-parameter dictionary `cfg`, which is referenced throughout this project.
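The global `cfg` dictionary follows the common attribute-style config pattern. Here is a minimal self-contained sketch of that pattern; the actual implementation in `src/utils/config.py` may differ, and the keys shown are only examples drawn from this README:

```python
# Minimal sketch of an attribute-style global config dictionary, as commonly
# used for a project-wide `cfg` (the real src/utils/config.py may differ).
class EasyDict(dict):
    """A dict whose entries are also readable/writable as attributes."""

    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self[name] = value


# Global hyper-parameter dictionary, imported by other modules.
cfg = EasyDict()
cfg.EVAL = EasyDict()
cfg.EVAL.MODE = "all"     # example key mentioned in this README
cfg.PRETRAINED_PATH = ""  # example key mentioned in this README
```

Modules then access settings uniformly, e.g. `cfg.EVAL.MODE`, instead of threading parameters through every function call.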
Run training and evaluation
Run `python train_eval.py --cfg path/to/your/yaml`, replacing `path/to/your/yaml` with the path to your configuration file. For example, to reproduce ASAR-GM (config 1):

```bash
python train_eval.py --cfg experiments/config1.yaml
```

For reproducibility, we release the three configurations of our ASAR-GM in `experiments/`, namely `config1.yaml`, `config2.yaml`, and `config3.yaml`.
To perform the various white-box attacks shown in the paper, run the following script:

```bash
python train_eval.py --cfg experiments/eval.yaml
```

Note that the full white-box attack evaluation is performed automatically by setting `EVAL.MODE: all`. To customize your own attack, set `EVAL.MODE` to `single`.
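Concretely, the relevant yaml fragment might look like this; only the `EVAL.MODE` key and its `all`/`single` values are confirmed above, so treat the layout as a sketch:

```yaml
EVAL:
  MODE: all       # run all white-box attacks automatically
  # MODE: single  # or: customize and run a single attack instead
```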
Additionally, to perform the various black-box attacks shown in the paper, run the following script:

```bash
python eval.py --cfg experiments/eval_blackbox.yaml --black
```

Note that you need to specify the model path via the variable `PRETRAINED_PATH` so that the model parameters are loaded. You are welcome to try your own configurations. If you find a better yaml configuration, please let us know by raising an issue or a PR, and we will update the benchmark!
RobustMatch provides pretrained models for the three ASAR-GM configurations. The model weights are available via Google Drive.
To use the pretrained models, first download the weight files, then add the following line to your yaml file:
```yaml
PRETRAINED_PATH: path/to/your/pretrained/weights
```

Citation

```
@inproceedings{ren2022appearance,
  title={Appearance and Structure Aware Robust Deep Visual Graph Matching: Attack, Defense and Beyond},
  author={Qibing Ren and Qingquan Bao and Runzhong Wang and Junchi Yan},
  booktitle={CVPR},
  year={2022}
}
```
