COVID-BERT: Pre-training of Deep Bidirectional Transformers for COVID-19 Text Understanding

Abstract

Fine-tuned BERT (https://arxiv.org/abs/1810.04805) on published medical research papers (abstract sections only). The papers are available from Kaggle (https://www.kaggle.com/allen-institute-for-ai/CORD-19-research-challenge).

Model URL

https://saved-language-models.s3.amazonaws.com/covid-bert-base-uncased.zip
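The archive can be downloaded and unpacked before loading. A minimal sketch in Python, assuming the zip expands into a model directory (the exact folder name inside the archive is an assumption; check the contents after extraction):

import urllib.request
import zipfile

# Fetch the fine-tuned model archive from the URL above and unpack it locally.
url = "https://saved-language-models.s3.amazonaws.com/covid-bert-base-uncased.zip"
urllib.request.urlretrieve(url, "covid-bert-base-uncased.zip")

with zipfile.ZipFile("covid-bert-base-uncased.zip") as zf:
    zf.extractall(".")  # extracted folder name may differ from the zip name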

Usage

from transformers import AutoTokenizer, AutoModel

# Replace with the local path to the directory extracted from the zip above
your_model_path = "./covid-bert-base-uncased"

covid_tokenizer = AutoTokenizer.from_pretrained(your_model_path)
covid_model = AutoModel.from_pretrained(your_model_path)
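Once loaded, the model behaves like any Hugging Face BERT encoder. A minimal sketch that extracts a sentence embedding from the [CLS] token (the example text is illustrative, not from the dataset):

import torch

text = "Coronaviruses are enveloped positive-sense RNA viruses."
inputs = covid_tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = covid_model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden); use the [CLS] position as a sentence vector
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768]) for a bert-base model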
