
NLP4IF

Code for the paper Multi Output Learning using Task Wise Attention for Predicting Binary Properties of Tweets, from the Shared Task on Fighting the COVID-19 Infodemic, NLP4IF workshop, NAACL'21.

This is the code for team dunder_mifflin, which secured the runners-up position on the English subtask of the competition.

Authors: Shreyas Kowshik, Ayush Suhane.

Overview

Multi-Head Task-Wise Attention

Proposed Architecture
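The idea behind task-wise attention can be sketched as follows. This is a minimal, dependency-free illustration of the concept only, not the paper's implementation: each binary property gets its own attention vector that pools the shared encoder's token embeddings into a task-specific representation, which would then feed a per-task binary classifier. The names `task_wise_attention` and `task_queries` are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def task_wise_attention(token_embs, task_queries):
    """For each task, pool the shared token embeddings with a
    task-specific attention vector, yielding one representation
    per task (one per binary property to predict)."""
    pooled = []
    for q in task_queries:
        # attention score of each token for this task
        scores = [sum(t * qi for t, qi in zip(tok, q)) for tok in token_embs]
        weights = softmax(scores)
        # weighted sum of token embeddings -> task-specific representation
        rep = [sum(w * tok[d] for w, tok in zip(weights, token_embs))
               for d in range(len(token_embs[0]))]
        pooled.append(rep)
    return pooled
```

In a trained model the task queries would be learned parameters, so each head can attend to the tokens most relevant to its property.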

Instructions to train

The codebase uses wandb for visualizations and progress tracking.

Install wandb as:

pip install wandb
wandb login

Now open the link shown in the terminal and paste your API key.

Note: Before training, make sure to add your wandb credentials as:

wandb.init(name=args.wandb_run, project='project_name', entity='entity_name')

in the files bert_train.py and roberta_train.py (line 68 in each file).

We used Google Colab for all our training, so the requirements.txt file lists the packages installed by default in a fresh Colab session. Any additional dependencies are installed by the commands below.

git clone https://github.com/shreyas-kowshik/nlp4if.git
cd nlp4if
bash setup.sh

python roberta_train.py -bs 32 -lr 5e-5 -lr_emb 5e-6 -e 60 -wdbr [wandb_run_name] -model roberta_attn_classwise --base roberta-base --save_model True -dtp data/english/v3/v3_augmented/covid19_disinfo_binary_english_train.tsv -ddp data/english/v3/v3/covid19_disinfo_binary_english_dev_input.tsv
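For reference, the flags in the command above could be wired up with a parser along these lines. This is a hypothetical sketch of what roberta_train.py might declare; the repository's actual parser may differ in names and defaults.

```python
import argparse

def build_parser():
    """Hypothetical parser mirroring the flags shown in the training command."""
    p = argparse.ArgumentParser(description="Train a tweet classifier with task-wise attention")
    p.add_argument("-bs", type=int, default=32, help="batch size")
    p.add_argument("-lr", type=float, default=5e-5, help="classifier-head learning rate")
    p.add_argument("-lr_emb", type=float, default=5e-6, help="encoder/embedding learning rate")
    p.add_argument("-e", type=int, default=60, help="number of training epochs")
    p.add_argument("-wdbr", type=str, help="wandb run name")
    p.add_argument("-model", type=str, default="roberta_attn_classwise", help="model variant")
    p.add_argument("--base", type=str, default="roberta-base", help="pretrained backbone")
    # argparse's type=bool treats any non-empty string as True, so parse explicitly
    p.add_argument("--save_model", type=lambda s: s.lower() == "true", default=False)
    p.add_argument("-dtp", type=str, help="path to the training .tsv")
    p.add_argument("-ddp", type=str, help="path to the dev-input .tsv")
    return p

args = build_parser().parse_args(
    ["-bs", "32", "-lr", "5e-5", "-wdbr", "demo_run", "--save_model", "True"]
)
```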

Pretrained Model

Final architecture weights link.
