
EMR-Merging

This is the official implementation of our NeurIPS 2024 Spotlight paper: EMR-Merging: Tuning-Free High-Performance Model Merging (arXiv). EMR-Merging achieves tuning-free, high-performance model merging.

We provide the code for merging ViT models, language models (including RoBERTa and GPT-2), and BEiT-3 models.

Method Framework: In the (a) Merging Procedure, we merge task-specific vectors into a unified task vector plus lightweight task-specific modulators that modulate its direction and amplitude. During the (b) Inference Procedure, we apply the corresponding mask and rescaler to the unified task vector to recover a specific task vector. The process of (c) Task-specific Direction and Amplitude Modulation consists of obtaining the task-specific masks and rescalers.
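The three steps above (electing a unified task vector, deriving per-task masks, and computing per-task rescalers) can be sketched in NumPy as follows. This is a minimal illustration on flattened parameter vectors, not the repository's actual code; function and variable names are our own.

```python
import numpy as np

def emr_merge(task_vectors):
    """Sketch of EMR-Merging on flattened task vectors.

    task_vectors: list of T arrays of shape [D], each the difference
    between a fine-tuned model's weights and the pretrained weights.
    Returns the unified task vector, per-task masks, and per-task rescalers.
    """
    tv = np.stack(task_vectors)                       # [T, D]
    # Elect: unified sign from the summed task vectors; unified magnitude
    # is the largest |value| among entries that agree with that sign.
    gamma = np.sign(tv.sum(axis=0))                   # [D]
    agree = np.sign(tv) == gamma                      # [T, D]
    eps = np.where(agree, np.abs(tv), 0.0).max(axis=0)
    tau_uni = gamma * eps                             # unified task vector
    # Mask: for each task, keep only entries whose sign matches tau_uni.
    masks = tv * tau_uni > 0                          # [T, D], boolean
    # Rescale: match each task vector's total magnitude after masking.
    scalers = np.abs(tv).sum(axis=1) / np.maximum(
        np.abs(masks * tau_uni).sum(axis=1), 1e-12)   # [T]
    return tau_uni, masks, scalers

def apply_task(pretrained, tau_uni, mask, scaler):
    """Inference: modulate the unified vector, then add it to the base weights."""
    return pretrained + scaler * (mask * tau_uni)
```

At inference time only the unified vector, one boolean mask per task, and one scalar per task need to be stored, which is what makes the modulators lightweight.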

Citation

If you find this project helpful, please consider citing our paper:

@inproceedings{huang_emrmerging,
 author = {Huang, Chenyu and Ye, Peng and Chen, Tao and He, Tong and Yue, Xiangyu and Ouyang, Wanli},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {A. Globerson and L. Mackey and D. Belgrave and A. Fan and U. Paquet and J. Tomczak and C. Zhang},
 pages = {122741--122769},
 publisher = {Curran Associates, Inc.},
 title = {EMR-Merging: Tuning-Free High-Performance Model Merging},
 url = {https://proceedings.neurips.cc/paper_files/paper/2024/file/dda5cac5272a9bcd4bc73d90bc725ef1-Paper-Conference.pdf},
 volume = {37},
 year = {2024}
}

Acknowledgement

Our implementation references the code below; we thank their authors.

