
nlp.pipe(n_process=x) does not work with doc._.language['language'] #4

donnydongchen opened this issue Apr 19, 2020 · 0 comments

donnydongchen commented Apr 19, 2020

Hello there,

I'm new to the spaCy universe. I've run into the issue below, but it may be something I'm doing wrong.

How to reproduce the behavior

import spacy
from spacy_langdetect import LanguageDetector

nlp = spacy.load('en_core_web_sm')
language_detector = LanguageDetector()
nlp.add_pipe(nlp.create_pipe('sentencizer'))
nlp.add_pipe(language_detector, name='language_detector', last=True)

text = ['I like bananas.', 'Do you like them?', 'No, I prefer wasabi.']
# Works without n_process; fails when n_process > 1.
for doc in nlp.pipe(text,
                    disable=["parser"],
                    batch_size=250,
                    n_process=4):
    print(doc._.language['language'])

Error

AttributeError: [E046] Can't retrieve unregistered extension attribute 'language'. Did you forget to call the `set_extension` method?
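Not a confirmed diagnosis, but the error suggests the `language` extension is never registered inside the worker processes that `n_process` spawns. A minimal stdlib-only sketch (hypothetical module and names, no spaCy involved) of the underlying mechanism: a freshly started worker re-imports your modules, so registrations made at import time are re-created, while registrations made only at runtime in the parent are lost. Here `importlib.reload` stands in for a worker importing the module from scratch:

```python
import importlib
import sys
import tempfile
import textwrap
from pathlib import Path

# A throwaway module whose import-time code registers an extension,
# mimicking "set_extension(...) at module level".
src = textwrap.dedent("""\
    EXTENSIONS = {}

    def register(name):
        EXTENSIONS[name] = None

    # Import-time registration: re-runs whenever the module is loaded,
    # just as it would in a freshly spawned worker process.
    register("language")
""")

tmpdir = tempfile.mkdtemp()
Path(tmpdir, "ext_demo.py").write_text(src)
sys.path.insert(0, tmpdir)

import ext_demo

# Runtime-only registration, done once in the "parent" process.
ext_demo.register("runtime_only")

# Reloading simulates a worker importing the module from scratch:
# the import-time registration survives, the parent-only one is lost.
importlib.reload(ext_demo)
print("language" in ext_demo.EXTENSIONS)
print("runtime_only" in ext_demo.EXTENSIONS)
```

If that is what is happening here, one direction to try would be registering the extension at module level yourself (e.g. `Doc.set_extension('language', default=None, force=True)` before calling `nlp.pipe`) rather than relying on registration happening only when the component is constructed in the parent process.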

Environment

  • spaCy version: 2.2.4
  • Python version: 3.7.6

Possibly relevant issues I found (non-extension)

explosion/spaCy#4903
explosion/spaCy#4737
