
DKT

Official implementation of "Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning" (ACL 2022 main conference).

IND pretrained models

To help developers quickly reproduce our results, we provide the IND pre-trained model. Download it from the link below and place it in the ./pretrain_models directory (see the example after the link).

Link: https://pan.baidu.com/s/14LwMTnebJjvTgKQhYwycCg (extraction code: 0fj1)
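
For example, a minimal sketch of putting the downloaded checkpoint in place could look like this; only the ./pretrain_models path comes from this README, and the archive name is an assumption:

    # Create the directory expected by the training scripts
    mkdir -p ./pretrain_models
    # Unpack the downloaded archive into it
    # (the archive name "ind_pretrained_model.zip" is hypothetical, not the actual file name)
    unzip ind_pretrained_model.zip -d ./pretrain_models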

Usage

Run the experiments with:

sh scripts/run.sh

You can change the parameters in the script.
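
As a rough, non-authoritative sketch of the kind of parameters such a script typically passes (the flag names and values below are assumptions, not the actual contents of scripts/run.sh):

    # Hypothetical excerpt of scripts/run.sh; check the real script for the supported flags
    python train.py \
        --dataset banking \                  # hypothetical dataset name
        --pretrain_dir ./pretrain_models \   # directory holding the IND pre-trained model
        --seed 42                            # random seed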
