
Released by @ines on Aug 2, 2019


File checksum: 5c5f585bbe23f8c1d565a52cec3400467892cbeb8e1a62ba45335d8c209b275c

Provides weights and configuration for the pretrained transformer model bert-base-german-cased, published by deepset. The package uses HuggingFace's pytorch-transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.

Requires the spacy-pytorch-transformers package to be installed. A CUDA-compatible GPU is recommended for reasonable performance.

| Feature | Description |
| --- | --- |
| **Name** | `de_pytt_bertbasecased_lg` |
| **Version** | `2.1.0` |
| **spaCy** | `>=2.1.7` |
| **Model size** | 406 MB |
| **Pipeline** | `sentencizer`, `pytt_wordpiecer`, `pytt_tok2vec` |
| **Sources** | `bert-base-german-cased` |
| **License** | MIT |
| **Author** | deepset (repackaged by Explosion) |


```bash
pip install spacy spacy-pytorch-transformers
spacy download de_pytt_bertbasecased_lg
```
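Once the model is downloaded, it loads like any other spaCy model. A minimal sketch of accessing the contextual representations (assuming the package above is installed; the `doc._.pytt_last_hidden_state` extension attribute is set by spacy-pytorch-transformers and the sentence here is only an illustration):

```python
import spacy

# Load the pretrained German BERT pipeline
# (sentencizer -> pytt_wordpiecer -> pytt_tok2vec).
nlp = spacy.load("de_pytt_bertbasecased_lg")

doc = nlp("Das ist ein Beispielsatz.")

# The transformer's output is exposed as an extension attribute:
# one contextual vector per wordpiece token.
print(doc._.pytt_last_hidden_state.shape)

# Token and doc vectors are aligned back to spaCy's tokenization,
# so they can be used as features in downstream components.
print(doc[0].vector.shape)
```

Note that the vectors here are contextual: the same word gets different representations in different sentences, which is what makes them useful as features for components trained on your own data.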