en_pytt_distilbertbaseuncased_lg-2.1.0
Details: https://spacy.io/models/en#en_pytt_distilbertbaseuncased_lg
File checksum: `ce1721a5bc849d23d9af5e4cdfcc041bb801349f025c2534ac9af8883e29e423`
Provides weights and configuration for the pretrained transformer model `distilbert-base-uncased`, published by Hugging Face. The package uses Hugging Face's `pytorch-transformers` implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
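For example, once installed the model loads like any other spaCy pipeline. A minimal sketch; the `pytt_last_hidden_state` extension attribute is registered by spacy-pytorch-transformers:

```python
import spacy

nlp = spacy.load("en_pytt_distilbertbaseuncased_lg")
doc = nlp("Apple shares rose on the news. Apple pie is delicious.")

# Token vectors come from the transformer, so similarity is contextual:
# "Apple" the company vs. "Apple" the fruit.
print(doc[0].similarity(doc[7]))

# The raw transformer output is exposed via extension attributes.
print(doc._.pytt_last_hidden_state.shape)
```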
Requires the `spacy-pytorch-transformers` package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
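If a GPU is available, you can ask spaCy to use it before loading the model. A minimal sketch of that pattern; the `torch.set_default_tensor_type` call assumes the PyTorch backend used by spacy-pytorch-transformers:

```python
import spacy
import torch

# Use the GPU if one is available; returns False and falls back to CPU otherwise.
is_using_gpu = spacy.prefer_gpu()
if is_using_gpu:
    # Route newly created PyTorch tensors to the GPU as well.
    torch.set_default_tensor_type("torch.cuda.FloatTensor")

nlp = spacy.load("en_pytt_distilbertbaseuncased_lg")
```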
| Feature | Description |
| --- | --- |
| Name | `en_pytt_distilbertbaseuncased_lg` |
| Version | `2.1.0` |
| spaCy | `>=2.1.7` |
| Model size | 245 MB |
| Pipeline | `sentencizer`, `pytt_wordpiecer`, `pytt_tok2vec` |
| Sources | `distilbert-base-uncased` |
| License | MIT |
| Author | Hugging Face (repackaged by Explosion) |
Installation

```bash
pip install spacy
pip install spacy-pytorch-transformers
python -m spacy download en_pytt_distilbertbaseuncased_lg
```
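After downloading, a quick check that the pipeline loads with the components listed in the table above (a minimal sketch):

```python
import spacy

nlp = spacy.load("en_pytt_distilbertbaseuncased_lg")
# Expected: ['sentencizer', 'pytt_wordpiecer', 'pytt_tok2vec']
print(nlp.pipe_names)
```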