This is the code repository for "Understanding Multimodal Deep Neural Networks: A Concept Selection View", accepted at CogSci 2024.

- Use `helper/prepare_concept_bank.ipynb` to build the concept library.
- Use `helper/image_representation.py` to extract the image representations.
- Use `helper/clip_label.py` to annotate the concepts with CLIP.
- Use `rough_selection.ipynb` to conduct the greedy rough selection.
- To run experiments, refer to the example command `bash scripts/example.sh`; `--algorithm` can be chosen from `lp`, `cbm`, and `mask`.
- Use `fine_selection.ipynb` to conduct the mask fine selection.
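The greedy rough selection itself lives in `rough_selection.ipynb`; as a rough illustration only, a generic greedy concept selector might look like the sketch below. The function name, the utility scores, and the redundancy penalty `lam` are all assumptions for illustration, not the paper's actual selection criterion.

```python
import numpy as np

def greedy_rough_selection(scores, sims, k, lam=0.5):
    """Greedily pick k concepts, trading off per-concept utility
    (scores) against redundancy with already-selected concepts (sims).

    This is a generic greedy sketch, not necessarily the criterion
    implemented in rough_selection.ipynb.
    """
    selected = []
    remaining = list(range(len(scores)))
    for _ in range(k):
        best, best_val = None, -np.inf
        for c in remaining:
            # Penalize a candidate by its highest similarity to any
            # already-chosen concept (0.0 when nothing is chosen yet).
            redundancy = max((sims[c, s] for s in selected), default=0.0)
            val = scores[c] - lam * redundancy
            if val > best_val:
                best, best_val = c, val
        selected.append(best)
        remaining.remove(best)
    return selected
```

With `lam` large enough, a high-scoring concept that nearly duplicates an already-selected one loses to a lower-scoring but more distinct concept, which is the intuition behind pruning a large concept bank before fine selection.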
If you find this code useful, please consider citing our paper:
```bibtex
@inproceedings{shang2023understanding,
  title={Understanding Multimodal Deep Neural Networks: A Concept Selection View},
  author={Shang, Chenming and Zhang, Hengyuan and Wen, Hao and Yang, Yujiu},
  booktitle={Proceedings of the Annual Meeting of the Cognitive Science Society},
  volume={46},
  year={2024}
}
```