A PyTorch implementation of our paper:
Unsupervised Hashing with Contrastive Learning by Exploiting Similarity Knowledge and Hidden Structure of Data
Accepted in Proceedings of the 31st ACM International Conference on Multimedia [ACM MM 2023].
- Place downloaded datasets at `datasets/`. (Cifar-10 will be downloaded automatically.)
- [Optional] Refer to `utils/data_path.py` to set dataset paths.
- Place pre-trained models at `models/pretrained_backbones/`. (A 300-epoch SimCLR pre-trained model for Cifar-10 is available here.)
- Configure training details in `configs/[dataset]/[stage]_[dataset]_[code_length].yml`.
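The placement steps above amount to creating two directories and dropping files into them; a minimal sketch, using only the paths listed in this README:

```shell
# Create the directories the README expects.
mkdir -p datasets models/pretrained_backbones

# datasets/                     <- downloaded datasets (Cifar-10 downloads automatically)
# models/pretrained_backbones/  <- pre-trained backbones, e.g. the 300-epoch
#                                  SimCLR Cifar-10 model linked above
```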
- Place the Cifar-10 dataset at `datasets/` manually, or let it be downloaded automatically later.
- Configure training details in `configs/cifar-10/mine_cifar10.yml`, `configs/cifar-10/cghash_cifar10_64.yml`, and `configs/cifar-10/selflabel_cifar10_64.yml`.
- Run the shell training script:

```shell
chmod +x ./run.sh
./run.sh [CODE_LENGTH] [GPU_ID]  # e.g. ./run.sh 64 0,1
```
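The contents of `run.sh` are not reproduced in this README. As a hedged sketch of what such a launcher might do, the snippet below builds one command per training stage from the code length and GPU ids, and prints the commands instead of executing them. Only the three Cifar-10 config paths come from this README; the `train.py` script name and the `--config`/`--gpus` flags are assumptions for illustration.

```shell
# Hypothetical sketch of a run.sh-style launcher (not the repo's actual script).
code_length="${1:-64}"   # e.g. 64
gpu_id="${2:-0}"         # e.g. 0,1

# Config files for the three stages, as listed in this README.
cfg1="configs/cifar-10/mine_cifar10.yml"
cfg2="configs/cifar-10/cghash_cifar10_${code_length}.yml"
cfg3="configs/cifar-10/selflabel_cifar10_${code_length}.yml"

# Print (rather than run) each stage's command; command shape is an assumption.
for cfg in "$cfg1" "$cfg2" "$cfg3"; do
  echo "python train.py --config ${cfg} --gpus ${gpu_id}"
done
```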