GigaBERT

This repo contains pre-trained models and the code-switched data generation script for GigaBERT:

@inproceedings{lan2020gigabert,
  author     = {Lan, Wuwei and Chen, Yang and Xu, Wei and Ritter, Alan},
  title      = {GigaBERT: Zero-shot Transfer Learning from English to Arabic},
  booktitle  = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year       = {2020}
}

Fine-tuning Experiments

Please check Yang Chen's GitHub for code and data.


Checkpoints

The pre-trained models can be found here: GigaBERT-v3 and GigaBERT-v4

Please contact Wuwei Lan for code-switched GigaBERT with different configurations.
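If the checkpoints are published on the Hugging Face Hub, they can be loaded with the standard `transformers` API. A minimal sketch is below; the model identifier is an assumption (substitute whichever checkpoint name the release actually uses), and `transformers` must be installed separately.

```python
def load_gigabert(model_name: str = "lanwuwei/GigaBERT-v4-Arabic-and-English"):
    """Load a GigaBERT checkpoint with Hugging Face transformers.

    The default model_name above is an assumed Hub identifier, not
    confirmed by this README; replace it with the actual checkpoint path.
    """
    # Lazy import so this module can be inspected without transformers installed.
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    return tokenizer, model
```

Once loaded, the tokenizer and model can be used like any other BERT-style encoder, e.g. `tokenizer("مرحبا", return_tensors="pt")` followed by a forward pass.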
