Forecast-PEFT: Parameter-Efficient Fine-Tuning for Pre-trained Motion Forecasting Models

Jifeng Wang1    Kaouther Messaoud2    Yuejiang Liu3    Juergen Gall1    Alexandre Alahi2   
1University of Bonn    2EPFL    3Stanford University

arXiv: 2407.19564

Highlights

  • Our method makes full use of the pretrained encoder and decoder, with a simple yet effective PEFT design (see the sketch below).
  • Forecast-PEFT achieves higher accuracy with only 17% tunable parameters.
  • Forecast-FT, our fully fine-tuned variant, improves over the baseline by up to 9.6%.
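
For intuition, the sketch below shows the general PEFT recipe the first highlight refers to: freeze the pretrained encoder/decoder and train only small injected modules and prompt tokens. It is a minimal PyTorch illustration; the module names and sizes are illustrative, not the repo's actual classes.

```python
# Illustrative PEFT recipe (not the repo's code): freeze the pretrained
# backbone and train only small injected adapters and prompt tokens.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter inserted alongside a frozen transformer block."""
    def __init__(self, dim: int, bottleneck: int = 32):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(torch.relu(self.down(x)))  # residual update

# Stand-in for a pretrained encoder; weights would come from a checkpoint.
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=128, nhead=8, batch_first=True),
    num_layers=4,
)
for p in backbone.parameters():
    p.requires_grad = False                     # pretrained weights stay frozen

adapter = Adapter(dim=128)                      # small tunable module
prompts = nn.Parameter(torch.zeros(1, 4, 128))  # learnable prompt tokens

# Only the adapter and the prompts are optimized.
optimizer = torch.optim.AdamW(list(adapter.parameters()) + [prompts], lr=1e-3)
```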

Getting Started

Setup Environment

1. Clone this repository:

```bash
git clone https://github.com/csjfwang/Forecast-PEFT.git
cd Forecast-PEFT
```

2. Setup conda environment:

```bash
conda create -n forecast_peft python=3.8
conda activate forecast_peft
sh ./scripts/setup.sh
```

3. Prepare the Argoverse 2 Motion Forecasting Dataset: we use the same preprocessing as Forecast-MAE, so follow its instructions to generate the preprocessed data (a quick sanity check is sketched below).
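
After preprocessing, something like the following can confirm the data splits are in place. The train/val/test layout is an assumption based on the Argoverse 2 convention; adjust it to your actual preprocessed output.

```python
# Sanity check (the split layout is an assumption: Argoverse 2 uses
# train/val/test). Counts the files found under each split directory.
from pathlib import Path

data_root = Path("/path/to/data_root")
for split in ("train", "val", "test"):
    files = list((data_root / split).glob("*"))
    print(f"{split}: {len(files)} files")
```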

Training

1. Pre-training (optional; our pretrained weights: download)

```bash
python3 train.py data_root=/path/to/data_root model=model_mae gpus=4 batch_size=32
```
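
Pre-training follows the masked-autoencoding idea of Forecast-MAE. As background, here is a generic sketch of per-batch random token masking; it illustrates the technique only and is not the repo's implementation (tensor shapes and the mask ratio are assumptions).

```python
# Generic MAE-style random masking sketch (illustrative, not the repo's code).
import torch

def random_masking(tokens: torch.Tensor, mask_ratio: float = 0.5):
    """tokens: (B, N, D). Returns the kept tokens and a boolean mask."""
    B, N, D = tokens.shape
    num_keep = int(N * (1 - mask_ratio))
    noise = torch.rand(B, N, device=tokens.device)
    ids_shuffle = noise.argsort(dim=1)       # random permutation per sample
    ids_keep = ids_shuffle[:, :num_keep]
    kept = torch.gather(
        tokens, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D)
    )
    mask = torch.ones(B, N, dtype=torch.bool, device=tokens.device)
    mask.scatter_(1, ids_keep, False)        # False = kept, True = masked
    return kept, mask
```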

2. Finetuning: Forecast-PEFT

```bash
python3 train.py data_root=/path/to/data_root model=model_forecast_peft gpus=4 batch_size=32 monitor=val_minFDE6 'pretrained_weights="/path/to/pretrain_ckpt"'
```
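
The 17% tunable-parameter figure from the highlights can be checked with a generic helper like the one below; it works on any PyTorch nn.Module and is not part of this repo.

```python
# Generic helper (not part of this repo): report trainable vs. total
# parameters, e.g. to verify a PEFT configuration.
import torch.nn as nn

def count_parameters(model: nn.Module) -> None:
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable: {trainable:,} / {total:,} "
          f"({100.0 * trainable / total:.1f}%)")
```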

3. Finetuning: Forecast-FT

```bash
python3 train.py data_root=/path/to/data_root model=model_forecast_ft gpus=4 batch_size=32 monitor=val_minFDE6 'pretrained_weights="/path/to/pretrain_ckpt"'
```
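
Forecast-FT, in contrast, leaves every parameter trainable. Below is a rough sketch of loading a pretrained checkpoint for full fine-tuning; the checkpoint layout is an assumption (PyTorch Lightning checkpoints usually nest the weights under a "state_dict" key).

```python
# Rough sketch (not the repo's API): load a pretrained checkpoint and
# fine-tune all parameters, i.e. nothing is frozen.
import torch
import torch.nn as nn

def load_for_full_finetune(model: nn.Module, ckpt_path: str) -> nn.Module:
    state = torch.load(ckpt_path, map_location="cpu")
    # Assumption: Lightning-style checkpoints nest weights under "state_dict".
    weights = state.get("state_dict", state)
    model.load_state_dict(weights, strict=False)
    for p in model.parameters():
        p.requires_grad = True  # full fine-tuning: every parameter is tuned
    return model
```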

Citation

```bibtex
@article{Wang2024ForecastPEFTPF,
  title={Forecast-PEFT: Parameter-Efficient Fine-Tuning for Pre-trained Motion Forecasting Models},
  author={Jifeng Wang and Kaouther Messaoud and Yuejiang Liu and Juergen Gall and Alexandre Alahi},
  journal={arXiv preprint arXiv:2407.19564},
  year={2024}
}
```

This repo is built on Forecast-MAE; thanks for their great work. Please also consider citing:

```bibtex
@inproceedings{cheng2023forecast,
  title={{Forecast-MAE}: Self-supervised Pre-training for Motion Forecasting with Masked Autoencoders},
  author={Cheng, Jie and Mei, Xiaodong and Liu, Ming},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2023}
}
```
