From d1385715c195db525f35ee8f8451a0c7a3b6b1a8 Mon Sep 17 00:00:00 2001
From: smanjil
Date: Tue, 3 Nov 2020 17:54:21 +0100
Subject: [PATCH 1/2] model details

---
 model_cards/smanjil/German-MedBERT/README.md | 29 ++++++++++++++++++++
 1 file changed, 29 insertions(+)
 create mode 100644 model_cards/smanjil/German-MedBERT/README.md

diff --git a/model_cards/smanjil/German-MedBERT/README.md b/model_cards/smanjil/German-MedBERT/README.md
new file mode 100644
index 0000000000000..ebeb22106fba8
--- /dev/null
+++ b/model_cards/smanjil/German-MedBERT/README.md
@@ -0,0 +1,29 @@
+# German Medical BERT
+
+This model is based on German BERT and fine-tuned on German medical-domain text.
+
+## Overview
+**Language model:** bert-base-german-cased
+
+**Language:** German
+
+**Fine-tuning:** Medical articles (diseases, symptoms, therapies, etc.)
+
+**Eval data:** NTS-ICD-10 dataset (classification)
+
+**Infrastructure:** Google Colab
+
+
+## Details
+- We fine-tuned the model using PyTorch with the Hugging Face library on a Colab GPU.
+- We used the standard hyperparameter settings for fine-tuning recommended in the original BERT paper.
+- However, the classification task required training for up to 25 epochs.
+
+## Performance (micro precision, recall, and F1 score for multi-label code classification)
+![performance](https://raw.githubusercontent.com/smanjil/finetune-lm/master/performance.png)
+
+## Author
+Manjil Shrestha: `shresthamanjil21 [at] gmail.com`
+
+Get in touch:
+[LinkedIn](https://www.linkedin.com/in/manjil-shrestha-038527b4/)

From 9de30d03bbdecc70a14b667d11a7c38f4853a533 Mon Sep 17 00:00:00 2001
From: Julien Chaumond
Date: Fri, 6 Nov 2020 09:19:56 +0100
Subject: [PATCH 2/2] Apply suggestions from code review

---
 model_cards/smanjil/German-MedBERT/README.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/model_cards/smanjil/German-MedBERT/README.md b/model_cards/smanjil/German-MedBERT/README.md
index ebeb22106fba8..a1b86607d6ed9 100644
--- a/model_cards/smanjil/German-MedBERT/README.md
+++ b/model_cards/smanjil/German-MedBERT/README.md
@@ -1,3 +1,7 @@
+---
+language: de
+---
+
 # German Medical BERT
 
 This model is based on German BERT and fine-tuned on German medical-domain text.
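
Below is a minimal usage sketch for this card, assuming the checkpoint is published on the Hugging Face Hub under the ID implied by the card path (`smanjil/German-MedBERT`) and that it keeps German BERT's `[MASK]` token; the example sentence and printed fields are illustrative only and are not taken from the card.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# Hub ID assumed from the model card path; adjust if the published name differs.
model_id = "smanjil/German-MedBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask demo on an illustrative German medical-style sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("Der Patient klagt über starke [MASK] im Bauchbereich."):
    print(prediction["token_str"], round(prediction["score"], 4))
```

For the multi-label NTS-ICD-10 code classification reported in the card, the checkpoint would instead be loaded into a sequence-classification head and fine-tuned further; the sketch above only exercises the masked-language-model objective.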