MAhaLanobis distance with COMpression complexity pooling (MALCOM) is an out-of-distribution image detector for already-deployed convolutional neural networks. This is the project repository for the paper "Convolutional Neural Networks with Compression Complexity Pooling for Out-of-Distribution Image Detection". The code is largely based on deep-Mahalanobis-detector.
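For intuition, the class-conditional Mahalanobis distance that MALCOM builds on can be sketched as follows. This is an illustrative toy example, not the repository's implementation; it uses 2-D features so the covariance inverse can be written out by hand.

```python
# Illustrative sketch (not repository code): Mahalanobis distance
# d(x) = sqrt((x - mu)^T Sigma^{-1} (x - mu)) for 2-D features.

def mahalanobis_2d(x, mean, cov):
    """Distance of point x from a Gaussian with the given mean and 2x2 covariance."""
    # Difference vector (x - mu)
    d0, d1 = x[0] - mean[0], x[1] - mean[1]
    # Inverse of the 2x2 covariance matrix, written out explicitly
    a, b = cov[0]
    c, d = cov[1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    # Quadratic form (x - mu)^T Sigma^{-1} (x - mu)
    q = (d0 * (inv[0][0] * d0 + inv[0][1] * d1)
         + d1 * (inv[1][0] * d0 + inv[1][1] * d1))
    return q ** 0.5

# With identity covariance this reduces to the Euclidean distance:
print(mahalanobis_2d([3.0, 4.0], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]))  # 5.0
```

In the Mahalanobis-based detectors below, the mean and covariance are estimated per class from the in-distribution training features, and a test image is scored by its distance to the closest class.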
This project was conducted on Ubuntu 16.04 LTS with
- Nvidia GPU (TITAN Xp) + CUDA Toolkit 10.2
- Python 3.6
- Anaconda3
- PyTorch >= 1.5
- PyCUDA >= 2019.1.2
- scikit-learn >= 0.22.1
- scipy >= 1.4.1
We highly recommend using Anaconda3 to install the packages. An example of the terminal commands for installation is as follows:
$ conda create -n malcom python=3.6
$ conda activate malcom
$ conda install pytorch torchvision cudatoolkit=10.2 -c pytorch
$ conda install scikit-learn scipy
$ python -m pip install pycuda
Note that the PyTorch installation command depends on the version of the Nvidia CUDA toolkit on the local machine. Please follow the installation guide from here.
We use the download links for the out-of-distribution datasets from ODIN:
Please place them in ./data/.
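A quick sanity check like the one below can confirm the expected directories exist before running the detectors. This is a hypothetical helper, not part of the repository; the directory names are taken from the instructions above.

```python
# Hypothetical sanity check (not repository code): verify that the expected
# directories exist under the project root before running the detectors.
from pathlib import Path

def missing_dirs(root, names):
    """Return the entries in `names` that are not subdirectories of `root`."""
    root = Path(root)
    return [n for n in names if not (root / n).is_dir()]

# Demonstration with a freshly created layout containing only `data`:
import os, tempfile
tmp = tempfile.mkdtemp()
os.makedirs(os.path.join(tmp, "data"))
print(missing_dirs(tmp, ["data", "pretrained"]))  # ['pretrained']
```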
We provide 9 pre-trained convolutional neural networks: VanillaCNN, DenseNet, and ResNet, each trained on CIFAR-10, CIFAR-100, and SVHN. Here, VanillaCNN is a simple convolutional neural network consisting of four convolutional layers.
- VanillaCNN on CIFAR-10 / VanillaCNN on CIFAR-100 / VanillaCNN on SVHN
- DenseNet on CIFAR-10 / DenseNet on CIFAR-100 / DenseNet on SVHN
- ResNet on CIFAR-10 / ResNet on CIFAR-100 / ResNet on SVHN
Please place them in ./pretrained/.
We provide the training script train.py for the image classifiers. It takes two mandatory input arguments:
- --net_type: vanilla | densenet | resnet
- --dataset: cifar10 | cifar100 | svhn
An example of training VanillaCNN for classifying CIFAR-10 is:
$ python train.py --net_type vanilla --dataset cifar10
The training time in this case (vanilla-cifar10) is less than 30 minutes. For repeated experiments, please change the random seed in the script to train different classifiers.
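The seed change mentioned above typically looks like the helper below. This is a minimal sketch, not the repository's actual seed handling; the PyTorch calls are shown as comments since they are not needed for the illustration.

```python
# Minimal seed-setting sketch (an assumption, not repository code).
import random

def set_seed(seed):
    random.seed(seed)
    # In the actual training script one would also seed the other libraries:
    # numpy.random.seed(seed)
    # torch.manual_seed(seed)
    # torch.cuda.manual_seed_all(seed)

# The same seed reproduces the same random draws:
set_seed(0)
a = [random.random() for _ in range(3)]
set_seed(0)
b = [random.random() for _ in range(3)]
print(a == b)  # True
```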
- MALCOM (without using OOD test samples)
$ python main.py --net_type vanilla --dataset cifar10
An example of output is:
==============================================================================
malcom detector (with vanilla trained on cifar10 w/o using ood samples):
TNR AUROC DTACC AUIN AUOUT
cifar100 16.83 64.60 60.67 61.50 65.29
svhn 74.71 95.27 90.05 92.14 96.86
imagenet_c 99.97 99.76 98.63 99.82 99.65
imagenet_r 93.74 98.66 94.40 98.64 98.68
lsun_crop 99.80 99.59 97.92 99.68 99.38
lsun_resiz 95.77 99.02 95.43 99.09 98.91
isun 93.16 98.65 94.23 98.87 98.41
==============================================================================
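If you need the numbers above programmatically, a table in this format can be parsed with a few lines of Python. This is a small sketch under the assumption that the output keeps the whitespace-separated layout shown above; it is not part of the repository.

```python
# Sketch (an assumption, not repository code): parse the printed metric table
# into a dictionary keyed by OOD dataset name.
def parse_metrics(text):
    lines = [line.split() for line in text.strip().splitlines()]
    header, rows = lines[0], lines[1:]
    return {row[0]: dict(zip(header, map(float, row[1:]))) for row in rows}

sample = """TNR AUROC DTACC AUIN AUOUT
svhn 74.71 95.27 90.05 92.14 96.86
isun 93.16 98.65 94.23 98.87 98.41"""

print(parse_metrics(sample)["svhn"]["AUROC"])  # 95.27
```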
- MALCOM++ (using OOD test samples)
$ python main.py --net_type vanilla --dataset cifar10 --ood_tuning 1
- Baseline, ODIN, and Mahalanobis detectors for comparison
$ python main.py --net_type vanilla --dataset cifar10 --detector_type baseline
$ python main.py --net_type vanilla --dataset cifar10 --detector_type odin --ood_tuning 1
$ python main.py --net_type vanilla --dataset cifar10 --detector_type mahalanobis --ood_tuning 1
Note that the argument ood_tuning should be set to 1 for the ODIN and Mahalanobis detectors.
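The flags used in the commands above can be summarized with a hypothetical argparse sketch. The real main.py likely defines more options; the names and choices below follow only the examples in this README.

```python
# Hypothetical argparse sketch mirroring the flags used in the example commands
# above (an assumption; the repository's actual main.py may differ).
import argparse

def build_parser():
    p = argparse.ArgumentParser(description="OOD detection entry point (sketch)")
    p.add_argument("--net_type", choices=["vanilla", "densenet", "resnet"],
                   required=True)
    p.add_argument("--dataset", choices=["cifar10", "cifar100", "svhn"],
                   required=True)
    p.add_argument("--detector_type",
                   choices=["malcom", "baseline", "odin", "mahalanobis"],
                   default="malcom")
    # ODIN and Mahalanobis need OOD samples for tuning, hence --ood_tuning 1
    p.add_argument("--ood_tuning", type=int, choices=[0, 1], default=0)
    return p

args = build_parser().parse_args(
    ["--net_type", "vanilla", "--dataset", "cifar10",
     "--detector_type", "odin", "--ood_tuning", "1"])
print(args.detector_type, args.ood_tuning)  # odin 1
```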