Provides weights and configuration for the pretrained transformer model
`bert-base-german-cased`, published by deepset. The package uses Hugging Face's
transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
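To illustrate the last point, here is a hedged sketch of training a text classifier on top of the contextual representations, assuming the spacy-transformers 0.x API (the `trf_textcat` component and `nlp.resume_training()`); `TRAIN_DATA` is a hypothetical, illustrative list of annotated examples you would supply yourself.

```python
import random
import spacy
from spacy.util import minibatch

# Hypothetical training data you would provide yourself.
TRAIN_DATA = [
    ("Das Essen war hervorragend.", {"cats": {"POSITIVE": 1.0, "NEGATIVE": 0.0}}),
    ("Der Service war leider enttäuschend.", {"cats": {"POSITIVE": 0.0, "NEGATIVE": 1.0}}),
]

nlp = spacy.load("de_trf_bertbasecased_lg")

# Add a text classifier that uses the transformer's contextual
# representations as features (spacy-transformers 0.x component name).
textcat = nlp.create_pipe("trf_textcat", config={"exclusive_classes": True})
for label in ("POSITIVE", "NEGATIVE"):
    textcat.add_label(label)
nlp.add_pipe(textcat)

# Fine-tune on your own data, starting from the pretrained transformer weights.
optimizer = nlp.resume_training()
for epoch in range(4):
    random.shuffle(TRAIN_DATA)
    losses = {}
    for batch in minibatch(TRAIN_DATA, size=8):
        texts, annotations = zip(*batch)
        nlp.update(texts, annotations, sgd=optimizer, losses=losses)
    print(epoch, losses)
```

With realistic amounts of training data you would train for more epochs and hold out an evaluation set; the loop above only shows the shape of the workflow.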
| Feature    | Description                             |
| ---------- | --------------------------------------- |
| Model size | 386 MB                                  |
| Vectors    | 0 keys, 0 unique vectors (0 dimensions) |
| Author     | deepset (repackaged by Explosion)       |
Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
```bash
pip install spacy
python -m spacy download de_trf_bertbasecased_lg
```
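Once downloaded, the model loads like any other spaCy pipeline. A minimal usage sketch (the example sentences are purely illustrative):

```python
import spacy

# Load the packaged German BERT pipeline (requires spacy-transformers).
nlp = spacy.load("de_trf_bertbasecased_lg")

doc = nlp("Die Katze sitzt auf der Matte.")

# Token vectors are contextual: they depend on the surrounding sentence.
print(doc[1].text, doc[1].vector.shape)

# Document similarity computed from the contextual representations.
other = nlp("Der Hund liegt im Garten.")
print(doc.similarity(other))
```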