jugechengzi/Rationalization-MCD

Note: depending on your torch version, you may need to replace `cls_loss = args.cls_lambda * F.cross_entropy(forward_logit, labels)` with `cls_loss = args.cls_lambda * F.cross_entropy(forward_logit, labels.long())`.
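The dtype issue can be reproduced in isolation. This is a minimal sketch, not code from the repo: the tensor shapes and values are illustrative stand-ins for `forward_logit` and `labels`; only the `F.cross_entropy` call and the `.long()` cast come from the note above.

```python
import torch
import torch.nn.functional as F

# Illustrative stand-ins for the repo's forward_logit and labels.
forward_logit = torch.randn(4, 2)                        # batch of 4, 2 classes
labels = torch.tensor([0, 1, 1, 0], dtype=torch.int32)   # int32 targets can fail on newer torch

# F.cross_entropy expects int64 (long) class indices; casting with
# labels.long() makes the call work across torch versions.
cls_lambda = 0.9
cls_loss = cls_lambda * F.cross_entropy(forward_logit, labels.long())
```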

MCD

This repo contains the PyTorch implementation of MCD (NeurIPS 2023 paper: D-Separation for Causal Self-Explanation). Most of our code is built on top of our previous work, FR.

If the code has any bugs, please open an issue. We will be grateful for your help.

Environments

- torch 1.13.1+cu116
- python 3.7.16
- NVIDIA RTX 3090

Datasets

Please refer to FR.

Running example

Correlated Beer (Table 2)

For the appearance aspect with sparsity of about 20%, run:

```shell
python -u decouple_bcr.py --correlated 1 --data_type beer --lr 0.0001 --batch_size 128 --gpu 0 --sparsity_percentage 0.175 --epochs 150 --aspect 0
```

If you have any other questions, please send me an email; I am happy to provide further help. Preparing the code is tedious, so I would appreciate it if you starred this repo before cloning it.

Result

Please refer to FR.

Acknowledgement

The code is largely based on CAR and DMR, and most of the hyperparameters (e.g., `--cls_lambda 0.9`) also follow them. We are grateful for their open-source code.
