TIMEOUT: Timeout exceeded error when trying tok = CoreNLPTokenizer() #23
Comments
Hi, this is not how you specify the CLASSPATH in Java. You need:
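The exact path the maintainer posted is elided above. As a hedged sketch (the `/path/to/corenlp` location is a placeholder, not from the thread), a CoreNLP classpath is usually set with a directory wildcard so every jar in the download directory is picked up:

```shell
# Placeholder path -- point this at wherever the CoreNLP jars were unpacked.
export CLASSPATH="$CLASSPATH:/path/to/corenlp/*"
echo "$CLASSPATH"
```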
I have done exactly the same.
Hm. Are you sure?
Weird. What shell are you using? bash? Try escaping:
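The escaped form the maintainer posted is elided; the usual gotcha, sketched here with a throwaway directory, is that an unquoted `dir/*` gets expanded by bash before Java ever sees it, whereas a quoted or escaped one reaches `java -cp` literally (Java then interprets `dir/*` itself as "all jars in dir"):

```shell
mkdir -p /tmp/corenlp_demo
touch /tmp/corenlp_demo/a.jar /tmp/corenlp_demo/b.jar
echo /tmp/corenlp_demo/*     # bash expands: /tmp/corenlp_demo/a.jar /tmp/corenlp_demo/b.jar
echo "/tmp/corenlp_demo/*"   # literal: /tmp/corenlp_demo/*
```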
I am using bash. The above also gives the same result.
Got the CLASSPATH corrected, but the error is still occurring. Since those are jar files, do I need to set up Java to run this?
Yes, you need Java 8.
As a more direct test, see if you can run:
I didn't have Java installed. Installing it made it work in Python, although the code above still threw an error. Anyway, it is working now. I think it would be a good idea to add Java to the dependencies list.
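Since a missing JVM was the root cause here, a quick generic check (not from the thread) for whether Java is on the PATH before blaming the tokenizer:

```python
import shutil
import subprocess

if shutil.which("java") is None:
    print("java not found on PATH -- install a JDK (CoreNLP needs Java 8)")
else:
    # Note: `java -version` writes its report to stderr, not stdout.
    info = subprocess.run(["java", "-version"], capture_output=True, text=True)
    print(info.stderr.strip())
```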
The classpath is working in my case, and the Java test string too, but the problem remains (I am logged in via SSH into a docker container):

09/17/2018 01:52:46 PM: [ Running on CPU only. ]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
Oy. Which versions of CoreNLP and pexpect are you using? The NER module of the versions past
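To answer the version question, something like this reports the installed pexpect version (or its absence) without assuming it is importable:

```python
try:
    import pexpect
    # pexpect exposes its version string as a module attribute.
    print("pexpect", pexpect.__version__)
except ImportError:
    print("pexpect is not installed")
```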
@RitwikGopi
I find I can run it from the command line:
But when I run a Python script as @ajfisch mentioned:

```python
from drqa.tokenizers import CoreNLPTokenizer
tok = CoreNLPTokenizer()
tok.tokenize('hello world').words()  # Should complete immediately
```

everything is still the same: after a very long wait, it fails with:
If you want to install Java directly from your Jupyter notebook (if you are using Google Colab or another cloud notebook):
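The notebook cell itself is elided above; on a Debian-based Colab runtime the usual pattern is something like the following environment-setup sketch (package name and JVM path are assumptions, not commands from the thread):

```shell
# In a notebook, prefix each line with "!" to run it as a shell command.
apt-get install -y openjdk-8-jdk-headless -qq > /dev/null
# Typical install location for OpenJDK 8 on Debian/Ubuntu images:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
java -version
```

Note that `export` inside a `!` cell does not persist between cells; setting `os.environ["JAVA_HOME"]` from Python is the persistent alternative.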
@ajfisch Can you please guide me on how to replace the CoreNLP tokenizer with spaCy in DrQA? I'm having a lot of trouble doing all this work.
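I can't verify DrQA's exact entry points here, so whether the repo exposes a drop-in spaCy tokenizer class is an assumption; but the behavior being replaced is small, and plain spaCy reproduces it (`spacy.blank` builds a tokenizer-only pipeline, so no trained model download is needed):

```python
import spacy

# Tokenizer-only English pipeline: no trained model required.
nlp = spacy.blank("en")

def words(text):
    """Hypothetical stand-in for DrQA's tokens.words(): the surface tokens."""
    return [t.text for t in nlp(text)]

print(words("hello world"))  # ['hello', 'world']
```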
When I try:
CLASSPATH is set properly