Usage example for the AllenNLP BiDAF pre-trained model
Updated Oct 12, 2018 · Jupyter Notebook
Jupyter Notebook & deep learning & quick-start tutorial
Pre-training and fine-tuning transformer models with PyTorch and the Hugging Face Transformers library. Whether you're pre-training on custom datasets or fine-tuning for specific classification tasks, these notebooks provide explanations alongside implementation code.
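As a minimal sketch of the fine-tuning setup described above (assuming `torch` and the Hugging Face `transformers` library are installed), the example below builds a small BERT-style sequence-classification model from a fresh config and runs one forward/backward step on toy data. The tiny config sizes and the random inputs are illustrative assumptions; a real fine-tuning run would instead load pretrained weights, e.g. with `BertForSequenceClassification.from_pretrained(...)`, and train on a labeled dataset.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Tiny config so the example runs quickly with no weight downloads;
# real fine-tuning would start from pretrained weights instead.
config = BertConfig(
    vocab_size=1000,
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=2,          # binary classification head
)
model = BertForSequenceClassification(config)

# Toy batch: 4 sequences of 16 token ids, with binary labels.
input_ids = torch.randint(0, 1000, (4, 16))
labels = torch.tensor([0, 1, 0, 1])

# Passing `labels` makes the model return a cross-entropy loss
# alongside the per-class logits.
outputs = model(input_ids=input_ids, labels=labels)
loss = outputs.loss
loss.backward()            # gradients for one optimizer step

print(outputs.logits.shape)  # one logit pair per sequence
```

In an actual training loop, this forward/backward step would be wrapped with an optimizer (e.g. `torch.optim.AdamW`) and iterated over batches from a `DataLoader`.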