This is the official implementation of the NeurIPS 2023 paper:
Contextually Affinitive Neighborhood Refinery for Deep Clustering, authored by Chunlin Yu, Ye Shi, and Jingya Wang†.
🍎 [ArXiv Paper] 🍇 [Slideslive Video]
The prerequisite for contextually affinitive neighborhood retrieval:

```bash
cd extension
sh make.sh
```

- To begin clustering, simply run:

  ```bash
  sh run.sh
  ```

  where you can modify the config file (e.g. `cifar10_r18_connr`) or the number of devices (e.g. `CUDA_VISIBLE_DEVICES=0,1,2,3`) in `run.sh`; a sketch of a possible `run.sh` is given after this list.
- For more customized uses, you can directly modify the config file in `configs/`.
- To skip the warm-up training and simply resume ConNR clustering from warm-up checkpoints, we provide the warm-up checkpoints in [Google Drive]. To resume training (an illustrative command sequence follows this list):
  - save the warm-up checkpoints into the folder `ckpt/your_run_name/save_models/`;
  - modify the corresponding variables `resume_name` and `resume_epoch` in the config file;
  - resume ConNR clustering by running `sh run.sh`.

The final checkpoints of ConNR clustering are saved in [Google Drive].
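For reference, here is a minimal sketch of what `run.sh` could contain. The `main.py` entry point, its `--config` flag, and the `.yml` suffix are assumptions made for illustration; only `CUDA_VISIBLE_DEVICES` and the config name come from the instructions above, so adapt the sketch to the scripts actually shipped in the repository.

```bash
# Hypothetical sketch of run.sh -- the entry point (main.py), its --config flag,
# and the .yml suffix are assumptions; adjust them to the released scripts.
export CUDA_VISIBLE_DEVICES=0,1,2,3      # GPUs to use

CONFIG=cifar10_r18_connr                 # any config under configs/

# Launch one worker per visible GPU (assumed distributed entry point).
torchrun --nproc_per_node=4 main.py --config configs/${CONFIG}.yml
```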
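Similarly, a hedged walk-through of the resume steps above; the run name and checkpoint filename are illustrative placeholders rather than the repository's exact names.

```bash
# Illustrative resume workflow (run name and checkpoint filename are placeholders).
RUN_NAME=cifar10_r18_connr

# 1) Put the downloaded warm-up checkpoint where ConNR expects it.
mkdir -p ckpt/${RUN_NAME}/save_models/
cp ~/Downloads/warmup_checkpoint.pth ckpt/${RUN_NAME}/save_models/

# 2) Edit the config in configs/ so that `resume_name` points to this run and
#    `resume_epoch` matches the epoch of the warm-up checkpoint.

# 3) Resume ConNR clustering.
sh run.sh
```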
Our framework is based on ProPos, and our ConNR implementation is inspired by GNN Reranking. Many thanks for their brilliant work and valuable contributions!
```bibtex
@article{yu2024contextually,
  title={Contextually Affinitive Neighborhood Refinery for Deep Clustering},
  author={Yu, Chunlin and Shi, Ye and Wang, Jingya},
  journal={Advances in Neural Information Processing Systems},
  volume={36},
  year={2024}
}
```