
Released by explosion-bot on Oct 8, 2019 · 67 commits to master since this release



File checksum: c2fa0f48ff5bf176a69ee0439a8d9b9f068a8805be8de60d38b1cac071d149c4

Provides weights and configuration for the pretrained transformer model bert-base-uncased, published by Google Research. The package uses HuggingFace's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.

| Feature | Description |
| --- | --- |
| **Name** | `en_trf_bertbaseuncased_lg` |
| **Version** | 2.2.0 |
| **spaCy** | `>=2.2.1` |
| **Model size** | 387 MB |
| **Pipeline** | `sentencizer`, `trf_wordpiecer`, `trf_tok2vec` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | bert-base-uncased (Google Research) |
| **License** | MIT |
| **Author** | Google Research (repackaged by Explosion) |

Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.


pip install spacy-transformers
python -m spacy download en_trf_bertbaseuncased_lg
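Once the model is downloaded, each token in a processed `Doc` carries a contextual vector, and these can be compared directly. A minimal sketch of that comparison, assuming the per-token vectors exposed via `doc.tensor` in spacy-transformers; the loading calls are shown only as comments, since they require the 387 MB model to be installed:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical usage once the model is installed (attribute names assumed
# from spacy-transformers; not executed here):
#
#   import spacy
#   nlp = spacy.load("en_trf_bertbaseuncased_lg")
#   doc = nlp("Apple shares rose on the news.")
#   # doc.tensor holds one contextual vector per spaCy token,
#   # so the same word gets different vectors in different contexts:
#   sim = cosine(doc.tensor[0], doc.tensor[1])
```

Because the representations are contextual, similarities computed this way depend on the surrounding sentence, unlike static word vectors.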