This repository contains the official implementation of the paper:
PMatch: Paired Masked Image Modeling for Dense Geometric Matching, CVPR'23, [arXiv]
Authors: Shengjie Zhu and Xiaoming Liu
```bash
# Clone the repository
git clone https://github.com/shngjz/PMatchRelease.git
cd PMatchRelease

# Install dependencies
conda create -n pmatch python=3.9
conda activate pmatch
pip install -r requirements.txt
```

Download the benchmark MegaDepth and ScanNet datasets from Hugging Face. Please make sure you agree to each dataset's license before downloading.

```bash
git clone https://huggingface.co/datasets/shngjz/ce29d0e9486d476eb73163644b050222/
mv ce29d0e9486d476eb73163644b050222 TwoViewBenchmark
```

Download the pre-trained models from Hugging Face using our provided script:
```bash
# Make the script executable if needed
chmod +x download_models.sh

# Run the download script
./download_models.sh
```

This will automatically download both models and place them in the `checkpoints` directory.
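Before running the benchmarks, it can help to verify that everything landed where the evaluation commands in this README expect. Below is a minimal sanity-check sketch; the checkpoint names and benchmark folder names are taken from the commands in this README, so adjust the paths if your layout differs:

```python
from pathlib import Path


def missing_files(repo_root=".", data_root="TwoViewBenchmark"):
    """Return expected checkpoint files / benchmark folders that are absent.

    Paths mirror the evaluation commands in this README; pass your own
    roots if you placed the repo or data elsewhere.
    """
    expected = [
        Path(repo_root) / "checkpoints" / "pmatch_mega.pth",
        Path(repo_root) / "checkpoints" / "pmatch_scannet.pth",
        Path(data_root) / "megadepth_test_1500",
        Path(data_root) / "scannet_test_1500",
    ]
    return [str(p) for p in expected if not p.exists()]


if __name__ == "__main__":
    missing = missing_files()
    if missing:
        print("Missing:", *missing, sep="\n  ")
    else:
        print("All checkpoints and benchmark folders found.")
```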
Run a simple demo with your own images:

```bash
python PMatch/Benchmarks/demo.py
```

Evaluate PMatch on the MegaDepth dataset:
```bash
python PMatch/Benchmarks/benchmark_pmatch_megadepth.py \
    --data_path /path/to/TwoViewBenchmark/megadepth_test_1500 \
    --checkpoints checkpoints/pmatch_mega.pth
```

Evaluate PMatch on the ScanNet dataset:
```bash
python PMatch/Benchmarks/benchmark_pmatch_scannet.py \
    --data_path /path/to/TwoViewBenchmark/scannet_test_1500 \
    --checkpoints checkpoints/pmatch_scannet.pth
```

If you find this code useful for your research, please cite our paper:
```bibtex
@inproceedings{zhu2023pmatch,
  title={PMatch: Paired Masked Image Modeling for Dense Geometric Matching},
  author={Zhu, Shengjie and Liu, Xiaoming},
  booktitle={CVPR},
  year={2023}
}
```