
Contrastive Learning of Sociopragmatic Meaning in Social Media

Chiyu Zhang, Muhammad Abdul-Mageed, Ganesh Jarwaha

Published in Findings of ACL 2023.


Illustration of our proposed InfoDCL framework. We exploit distant/surrogate labels (i.e., emojis) to supervise two contrastive losses: a corpus-aware contrastive loss (CCL) and a light label-aware contrastive loss (LCL-LiT). Sequence representations from our model should keep the cluster of each class distinguishable and preserve the semantic relationships between classes.
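
Both objectives are contrastive losses over sequence representations, supervised by the emoji surrogate labels. For intuition only, here is a minimal sketch of a generic label-aware (supervised) contrastive loss in PyTorch; it is not the paper's CCL or LCL-LiT formulation, which add corpus- and label-aware weighting as described in the paper.

import torch
import torch.nn.functional as F

def label_aware_contrastive_loss(embeddings, labels, temperature=0.05):
    # Generic supervised contrastive loss (SupCon-style), shown only for intuition.
    # embeddings: (N, d) sequence representations; labels: (N,) surrogate labels (e.g., emoji ids).
    z = F.normalize(embeddings, dim=-1)                       # cosine-similarity space
    sim = z @ z.t() / temperature                             # (N, N) similarity logits
    self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))           # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)             # avoid division by zero
    # average log-probability over in-batch examples sharing the same surrogate label
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
    return loss.mean()

emb = torch.randn(8, 16)                                      # dummy batch of representations
lbl = torch.tensor([0, 0, 1, 1, 2, 2, 0, 1])                  # dummy emoji ids
print(label_aware_contrastive_loss(emb, lbl).item())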

Checkpoints of Models Pre-Trained with InfoDCL
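
The checkpoints are intended as drop-in replacements for standard pre-trained encoders. Below is a minimal sketch of loading one with Hugging Face transformers for downstream fine-tuning; the model identifier is a placeholder, so substitute the released checkpoint name or a local path.

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# "path/to/infodcl-checkpoint" is a placeholder, not an actual model id; replace it
# with the released checkpoint name or a local directory containing the weights.
checkpoint = "path/to/infodcl-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

inputs = tokenizer("so excited for the weekend!", return_tensors="pt")
logits = model(**inputs).logits   # fine-tune on a downstream SM task as usual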

Model Performance

Main results: fine-tuning performance on our 24 Sociopragmatic Meaning datasets (average macro-F1 over five runs).
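
For reference, the reported metric is macro-F1 averaged over five fine-tuning runs; a small sketch of that computation with scikit-learn (the labels below are dummy data, not results from the paper):

import numpy as np
from sklearn.metrics import f1_score

# Dummy gold labels and per-run predictions; in practice each run is an independent
# fine-tuning of the model on a downstream dataset.
gold = [0, 1, 2, 1, 0, 2]
runs = [
    [0, 1, 2, 1, 0, 1],
    [0, 1, 2, 2, 0, 2],
    [0, 2, 2, 1, 0, 2],
]
scores = [f1_score(gold, preds, average="macro") for preds in runs]
print(f"average macro-F1 over {len(scores)} runs: {np.mean(scores):.4f}")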

Use of Code

Our code is developed based on the scripts of SimCSE (Gao et al., 2021). To install:

git clone https://github.com/UBC-NLP/infodcl
cd infodcl
python setup.py install

Run pre-training with InfoDCL

To launch pre-training, run the shell script provided in the repository; a rough sketch of such an invocation is shown below.
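
The sketch below only illustrates what a SimCSE-style training launch might look like from Python; the script name and every flag are assumptions, not this repository's actual interface, so consult the provided shell script for the real entry point and arguments (including those controlling the emoji surrogate labels and the CCL/LCL-LiT objectives).

import subprocess

# Hypothetical invocation following SimCSE-style conventions; the script name and
# all flags are assumptions -- check the repository's shell script for the actual ones.
subprocess.run([
    "python", "train.py",
    "--model_name_or_path", "roberta-base",
    "--train_file", "data/tweets_with_emoji_labels.txt",   # placeholder path
    "--output_dir", "results/infodcl-roberta",
    "--num_train_epochs", "3",
    "--per_device_train_batch_size", "64",
    "--learning_rate", "2e-5",
    "--temp", "0.05",
    "--do_train",
], check=True)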

Citation

Please cite us if you use our code or models.

@inproceedings{zhang-2023-infodcl,
  author    = {Chiyu Zhang and
               Muhammad Abdul-Mageed and
               Ganesh Jarwaha},
  title     = {Contrastive Learning of Sociopragmatic Meaning in Social Media},
  booktitle = {Findings of the Association for Computational Linguistics: {ACL} 2023},
  year      = {2023},
}

Contact Us

If you have any questions related to the code or the paper, feel free to email Chiyu Zhang (chiyuz94@gmail.com).
