Provides weights and configuration for the pretrained transformer model
bert-base-german-cased, published by deepset. The package uses HuggingFace's
pytorch-transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
Requires the spacy-pytorch-transformers package to be installed. A CUDA-compatible GPU is recommended for reasonable performance.
| Detail | Value |
| --- | --- |
| Model size | 406 MB |
| Author | deepset (repackaged by Explosion) |
```bash
pip install spacy
python -m spacy download de_pytt_bertbasecased_lg
```
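Once downloaded, the package loads like any other spaCy model. A minimal sketch of accessing the contextual representations, assuming the model is installed locally (the example sentence and the use of `doc.tensor` as the feature store follow spacy-pytorch-transformers conventions and are illustrative):

```python
import spacy

# Load the downloaded transformer model package
nlp = spacy.load("de_pytt_bertbasecased_lg")

# Process a German sentence; the transformer assigns contextual
# representations to the Doc and its tokens
doc = nlp("Die Katze sitzt auf der Matte.")

# doc.tensor holds the aligned contextual token representations,
# which can be used as features in downstream pipeline components
# trained on your own data
print(doc.tensor.shape)

# Token-level similarity is computed from the contextual vectors,
# so the same word form gets different vectors in different contexts
print(doc[1].similarity(doc[2]))
```

Because the representations are contextual, they are recomputed per document rather than looked up from a static table, which is why a GPU is recommended for processing larger volumes of text.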