
@ines ines released this Aug 2, 2019 · 17 commits to master since this release


File checksum (SHA-256): `e291eea1f46438754b787774769155d1a42da7c7e33cd4e85589d7eef723a183`
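To check that the download was not corrupted, you can compare the archive against the published SHA-256 checksum. This is a minimal sketch: the archive filename below is an assumption based on spaCy's usual `name-version.tar.gz` naming, so substitute the actual asset you downloaded.

```shell
# Verify the downloaded release asset against the published checksum.
# NOTE: the filename is an assumed example -- use your actual download.
echo "e291eea1f46438754b787774769155d1a42da7c7e33cd4e85589d7eef723a183  en_pytt_xlnetbasecased_lg-2.1.0.tar.gz" \
  | sha256sum --check
```

`sha256sum --check` reads `<hash>  <filename>` pairs from stdin and reports `OK` or `FAILED` per file.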

Provides weights and configuration for the pretrained transformer model xlnet-base-cased, published by CMU & Google Brain. The package uses HuggingFace's pytorch-transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, drawing on knowledge from a large corpus of unlabelled text. You can use these contextual word representations as features in a variety of pipeline components that can be trained on your own data.

Requires the spacy-pytorch-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
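As a minimal usage sketch: once the model is downloaded, the contextual representations can be compared across tokens. This assumes the `doc.tensor` alignment behaviour described in the spacy-pytorch-transformers README; attribute details may differ between versions, and the `cosine` and `demo` helpers below are illustrative names, not part of the package.

```python
import math


def cosine(a, b):
    """Plain cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def demo():
    """Usage sketch -- requires the model to be downloaded first.

    The attributes used here follow the spacy-pytorch-transformers
    README and may change between versions.
    """
    import spacy

    nlp = spacy.load("en_pytt_xlnetbasecased_lg")
    # The two mentions of "Apple" receive different contextual vectors,
    # since the transformer conditions each token on its sentence.
    doc = nlp("Apple shares rose on the news. Apple pie is delicious.")
    # doc.tensor holds the token-aligned contextual word representations.
    return cosine(doc.tensor[0].tolist(), doc.tensor[7].tolist())
```

Calling `demo()` returns the similarity between the company-sense and the fruit-sense occurrences of "Apple", which should be noticeably below 1.0.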

| Feature | Description |
| --- | --- |
| **Name** | `en_pytt_xlnetbasecased_lg` |
| **Version** | `2.1.0` |
| **spaCy** | `>=2.1.7` |
| **Model size** | 434 MB |
| **Pipeline** | `sentencizer`, `pytt_wordpiecer`, `pytt_tok2vec` |
| **Sources** | `xlnet-base-cased` |
| **License** | MIT |
| **Author** | CMU & Google Brain (repackaged by Explosion) |


```
pip install spacy spacy-pytorch-transformers
spacy download en_pytt_xlnetbasecased_lg
```