en_pytt_robertabase_lg-2.1.0
Details: https://spacy.io/models/en#en_pytt_robertabase_lg
File checksum:
af149def9d65ac3580b075b65b7fbfcb9560a99a33cbb38be17f4052e9c277b5
Provides weights and configuration for the pretrained transformer model `roberta-base`, published by Facebook. The package uses HuggingFace's `pytorch-transformers` implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data, as shown in the sketch below.

Requires the `spacy-pytorch-transformers` package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
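
For example, once the model is installed (see Installation below), the contextual representations are available through spaCy's usual vector API, and the raw transformer state through extension attributes. A minimal sketch; the `pytt_last_hidden_state` attribute name follows the `spacy-pytorch-transformers` documentation at the time of this release and is an assumption here:

```python
import spacy

# Load the packaged pipeline: sentencizer, pytt_wordpiecer, pytt_tok2vec.
nlp = spacy.load("en_pytt_robertabase_lg")
doc = nlp("Apple shares rose on the news.")

# The transformer output is aligned back to spaCy tokens, so the usual
# vector and similarity APIs reflect contextual representations.
print(doc[0].vector.shape)
print(doc[0].similarity(doc[-1]))

# Raw transformer state is exposed via extension attributes; the attribute
# name below is an assumption based on the spacy-pytorch-transformers docs.
print(doc._.pytt_last_hidden_state.shape)
```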
| Feature | Description |
| --- | --- |
| Name | `en_pytt_robertabase_lg` |
| Version | `2.1.0` |
| spaCy | `>=2.1.7` |
| Model size | 292 MB |
| Pipeline | `sentencizer`, `pytt_wordpiecer`, `pytt_tok2vec` |
| Sources | `roberta-base` |
| License | MIT |
| Author | Facebook (repackaged by Explosion) |
Installation

```bash
pip install spacy
pip install spacy-pytorch-transformers
python -m spacy download en_pytt_robertabase_lg
```
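
Since a CUDA-compatible GPU is advised, a quick post-install smoke test can also confirm the pipeline runs on GPU where available. `spacy.prefer_gpu()` is spaCy's standard helper; the `doc.tensor` access assumes `pytt_tok2vec` follows spaCy's usual tok2vec convention of writing per-token features there:

```python
import spacy

# Use the GPU if one is available; prefer_gpu() returns False and falls
# back to CPU otherwise (functional, but slow for transformer models).
spacy.prefer_gpu()

nlp = spacy.load("en_pytt_robertabase_lg")
doc = nlp("A quick smoke test after installation.")

# Assumption: pytt_tok2vec stores per-token contextual features in doc.tensor.
print(doc.tensor.shape)
```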