
Implementation for Variational Information Bottleneck for Effective Low-resource Fine-tuning, ICLR 2021

rabeehk/vibert

Variational Information Bottleneck for Effective Low-Resource Fine-Tuning

Python requirements

This code is tested on:

  • Python 3.7.7
  • transformers 2.8.0
  • pytorch 1.2.0

Downloading the datasets

You can download the datasets, which come from previously published papers, from the following paths, and place them all in "data/datasets/".

Parameters in the code

  • num_samples: specifies the number of samples when running models on the subsampled datasets.
  • ib_dim: specifies the bottleneck size.
  • ib: if this option is set, runs the VIBERT model.
  • deterministic: if this option is set, runs the VIBERT model with beta=0.
  • beta: specifies the weight of the compression loss.
  • mixout: defines the mixout probability.
  • weight_decay: defines the weight for weight-decay regularization.
  • To run the model on the subsampled datasets, add the --sample_train option and specify the number of samples with --num_samples N.
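To illustrate how the beta and ib_dim parameters interact, here is a minimal numpy sketch of a variational-information-bottleneck-style objective (not the repo's actual PyTorch code; the function names and the choice of a diagonal-Gaussian posterior with a standard-normal prior are assumptions for illustration): the task loss is combined with a beta-weighted KL "compression" term, and beta=0 recovers the deterministic ablation.

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over the
    bottleneck dimensions (length of mu/logvar plays the role of ib_dim)."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def vib_objective(task_loss, mu, logvar, beta):
    """Total loss = task loss + beta * compression loss.
    With beta=0 the compression term vanishes (the deterministic ablation)."""
    return task_loss + beta * kl_to_standard_normal(mu, logvar)

# Hypothetical example with a bottleneck of size 4:
mu = np.zeros(4)
logvar = np.zeros(4)   # sigma = 1 everywhere, so the KL term is exactly 0
print(vib_objective(0.7, mu, logvar, beta=1e-5))  # -> 0.7
```

Larger beta values compress the representation more aggressively by pulling the posterior toward the prior, which is the trade-off the beta flag above controls.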

Usage

We provide the following sample scripts. Before using these scripts, please change bert_path to the path of your BERT model.

  1. To train the BERT base model:
sh sample_commands/bert.sh
  2. To train the VIBERT model:
sh sample_commands/vibert.sh
  3. To train the Dropout model:
sh sample_commands/dropout.sh
  4. To train the Mixout model:
sh sample_commands/mixout.sh
  5. To train the WD model:
sh sample_commands/wd.sh
  6. To train VIBERT (beta=0), used in the ablation study:
sh sample_commands/bert_beta_zero.sh

Bibliography

If you find this repo useful, please cite our paper.

@inproceedings{mahabadi2021variational,
  title={Variational Information Bottleneck for Effective Low-Resource Fine-Tuning},
  author={Rabeeh Karimi Mahabadi and Yonatan Belinkov and James Henderson},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=kvhzKz-_DMF}
}

Final words

We hope this repo is useful for your research. For any questions, please open an issue or email rabeeh.karimi@idiap.ch or rabeeh.k68@gmail.com, and we will get back to you as soon as possible.
