
@explosion-bot explosion-bot released this Oct 8, 2019 · 67 commits to master since this release



File checksum: f9f27bfd138f5b55b3177bb2d933d1825a107275f599c11797f9c2f5dea048b4

Provides weights and configuration for the pretrained transformer model bert-base-german-cased, published by deepset. The package uses HuggingFace's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.

| Feature | Description |
| --- | --- |
| **Name** | `de_trf_bertbasecased_lg` |
| **Version** | `2.2.0` |
| **spaCy** | `>=2.2.1` |
| **Model size** | 386 MB |
| **Pipeline** | `sentencizer`, `trf_wordpiecer`, `trf_tok2vec` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | bert-base-german-cased (deepset) |
| **License** | MIT |
| **Author** | deepset (repackaged by Explosion) |

Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.


```bash
pip install spacy-transformers
python -m spacy download de_trf_bertbasecased_lg
```