Dynamic Metric Learning with Cross-Level Concept Distillation

This repository is the official PyTorch implementation of Dynamic Metric Learning with Cross-Level Concept Distillation.

Framework

[Figure: overview of the CLCD framework]

Datasets

The three DyML datasets can be downloaded from here. Put the dataset files in ./datasets.
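As a rough sketch, the expected layout looks something like the following (the split folder names are assumptions based on the DyML benchmark; use whatever names the training scripts expect):

./datasets/dyml_animal/
./datasets/dyml_vehicle/
./datasets/dyml_product/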

Requirements

To install requirements:

pip install -r requirements.txt
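For example, assuming Python 3 with pip available, the dependencies can be installed into a fresh virtual environment (the environment name clcd-env is just illustrative):

python -m venv clcd-env
source clcd-env/bin/activate
pip install -r requirements.txt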

Training

To train the proposed CLCD method, run one of the following commands:

bash command.sh

or

bash command_product.sh
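To pin a run to particular GPUs, the standard CUDA_VISIBLE_DEVICES environment variable can be set before launching the script (the GPU indices below are illustrative):

CUDA_VISIBLE_DEVICES=0,1 bash command.sh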

Device

We tested our code on a Linux machine with two NVIDIA RTX 3090 GPUs.
