🔥🔥🔥 Our arXiv version is now available. Please check it out! 🔥🔥🔥
Figure 1: Overview of the Deep Prompt-Tuning vs. CaPT (ours) frameworks. (a) Original deep prompt-tuning. (b) The overall architecture of our proposed CaPT, which integrates both task-aware guidance and an instance-aware signal to trigger the “attention anchor”.
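To make the idea in the figure concrete, below is a minimal NumPy sketch of the core mechanism as we understand it from the caption: a single "capsule" prompt vector is formed by fusing a learned task-level vector with an instance-dependent signal, and that one vector is prepended to the token embeddings. All names (`task_prompt`, `W_inst`, `capsule_prompt`), the pooling choice, and the fusion by addition are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 16            # hidden size (hypothetical)
batch, seq_len = 2, 8

# Learned task-level prompt vector (task-aware guidance) -- assumed trainable.
task_prompt = rng.normal(size=(d_model,))
# Projection mapping pooled input features to an instance-aware signal (assumed).
W_inst = rng.normal(size=(d_model, d_model)) * 0.02

def capsule_prompt(token_embeds: np.ndarray) -> np.ndarray:
    """Fuse task- and instance-aware signals into ONE prompt vector and
    prepend it to the token embeddings of shape (batch, seq, d_model)."""
    pooled = token_embeds.mean(axis=1)                # (batch, d_model)
    instance_signal = pooled @ W_inst                 # (batch, d_model)
    capsule = task_prompt[None, :] + instance_signal  # (batch, d_model)
    # Prepend the single capsule vector as one extra sequence position.
    return np.concatenate([capsule[:, None, :], token_embeds], axis=1)

x = rng.normal(size=(batch, seq_len, d_model))
out = capsule_prompt(x)
print(out.shape)  # sequence length grows by exactly one prompt position
```

Note that, unlike deep prompt-tuning in panel (a), this adds only a single vector rather than a full set of prompt tokens per layer.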
## Installation

```shell
git clone https://github.com/comeandcode/CaPT.git
cd CaPT
conda create -n capt python=3.9.18 -y
conda activate capt
pip install -r requirements.txt
python setup.py install
```

## Data Preparation

```shell
python data/superglue/get_huggingface_superglue.py
```

## Training

```shell
sh train.sh
```

## Citation

```bibtex
@inproceedings{liu2025all,
  title={All You Need is One: Capsule Prompt Tuning with a Single Vector},
  author={Liu, Yiyang and Liang, James Chenhao and Fan, Heng and Yang, Wenhao and Cui, Yiming and Han, Xiaotian and Huang, Lifu and Liu, Dongfang and Wang, Qifan and Han, Cheng},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025}
}
```

## Acknowledgement

Our training pipeline and data-processing module are built on the SMoP codebase. We thank the authors for their effort.
