PermanentlyFailedException: Timed out waiting for service to come alive. Part3 #245

Closed
skmalviya opened this issue Apr 13, 2020 · 32 comments

Comments

@skmalviya

Hi! I know this is similar to #52 and #91, but I am unable to understand how those were solved.

When I run it on the command line (Ubuntu 16.04.6 LTS), it runs successfully, as shown below:

java -Xmx16G -cp "/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-34d0c1fe4d724a56.props -preload tokenize,ssplit,pos,lemma,ner

[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP -     Threads: 5
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[main] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.6 sec].
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [1.2 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0.5 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0.7 sec].
[main] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
[main] INFO edu.stanford.nlp.time.TimeExpressionExtractorImpl - Using following SUTime rules: edu/stanford/nlp/models/sutime/defs.sutime.txt,edu/stanford/nlp/models/sutime/english.sutime.txt,edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 580704 unique entries out of 581863 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_caseless.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 4869 unique entries out of 4869 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_cased.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 585573 unique entries from 2 files
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:9000

But when I run it from a Python script, it fails with the error below:


import os
os.environ["CORENLP_HOME"] = '/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05'

# Import client module
from stanza.server import CoreNLPClient


client = CoreNLPClient(be_quite=False, classpath='"/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/*"', annotators=['tokenize','ssplit', 'pos', 'lemma', 'ner'], memory='16G', endpoint='http://localhost:9000')
print(client)

client.start()
#import time; time.sleep(10)

text = "Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity."
print (text)
document = client.annotate(text)
print ('malviya')
print(type(document))

Error:

<stanza.server.client.CoreNLPClient object at 0x7fd296e40d68>
Starting server with command: java -Xmx4G -cp "/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05"/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-9a4ccb63339146d0.props -preload tokenize,ssplit,pos,lemma,ner
Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity.

Traceback (most recent call last):
  File "stanza_eng.py", line 18, in <module>
    document = client.annotate(text)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 431, in annotate
    r = self._request(text.encode('utf-8'), request_properties, **kwargs)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 342, in _request
    self.ensure_alive()
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 161, in ensure_alive
    raise PermanentlyFailedException("Timed out waiting for service to come alive.")
stanza.server.client.PermanentlyFailedException: Timed out waiting for service to come alive.

Python 3.6.10
asn1crypto==1.3.0
certifi==2020.4.5.1
cffi==1.14.0
chardet==3.0.4
cryptography==2.8
embeddings==0.0.8
gast==0.2.2
idna==2.9
numpy==1.18.2
protobuf==3.11.3
pycparser==2.20
pyOpenSSL==19.1.0
PySocks==1.7.1
requests==2.23.0
six==1.14.0
stanza==1.0.0
torch==1.4.0
tqdm==4.44.1
urllib3==1.25.8
vocab==0.0.4

I am unable to understand the issue here...

@yuhaozhang
Member

Can you try to remove the double quotation marks in the classpath argument and try again? In other words, try to use the following when calling CoreNLPClient:

classpath='/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/*'
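
For reference, a minimal sketch of the corrected call (illustrative only, reusing the paths and arguments from the report above) would look like:

import os
os.environ["CORENLP_HOME"] = '/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05'

from stanza.server import CoreNLPClient

# The classpath glob is passed as a plain Python string; shell-style double
# quotes around it become part of the path and break the Java classpath.
client = CoreNLPClient(
    classpath='/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/*',
    annotators=['tokenize', 'ssplit', 'pos', 'lemma', 'ner'],
    memory='16G',
    endpoint='http://localhost:9000')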

@yuhaozhang
Member

In fact, since you have already set the CORENLP_HOME environment variable in the script, you can simply skip the classpath argument; CoreNLPClient uses CORENLP_HOME as the classpath by default. Try omitting classpath and see if that fixes it for you.

@skmalviya
Author

I changed the code as suggested:


import os
os.environ["CORENLP_HOME"] = '/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05'

# Import client module
from stanza.server import CoreNLPClient


client = CoreNLPClient(be_quite=False, annotators=['tokenize','ssplit', 'pos', 'lemma', 'ner'], memory='16G', endpoint='http://localhost:9000')
print(client)

client.start()
#import time; time.sleep(10)

text = "Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity."
print (text)
document = client.annotate(text)
print ('malviya')
print(type(document))

But the error still persists:
Error:

<stanza.server.client.CoreNLPClient object at 0x7f4b70b23e80>
Starting server with command: java -Xmx16G -cp /home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-7a3beb5da9fb4b5b.props -preload tokenize,ssplit,pos,lemma,ner
Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity.
Traceback (most recent call last):
  File "stanza_eng.py", line 17, in <module>
    document = client.annotate(text)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 431, in annotate
    r = self._request(text.encode('utf-8'), request_properties, **kwargs)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 342, in _request
    self.ensure_alive()
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 161, in ensure_alive
    raise PermanentlyFailedException("Timed out waiting for service to come alive.")
stanza.server.client.PermanentlyFailedException: Timed out waiting for service to come alive.

And when I run the java command in a terminal:

java -Xmx16G -cp /home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-7a3beb5da9fb4b5b.props -preload tokenize,ssplit,pos,lemma,ner

I now get an error that did not appear before when running it in the terminal:

Error: Could not find or load main class .home.naive.Documents.shrikant.Dialogue_Implement.DST.stanford-corenlp-full-2018-10-05.corenlp.sh

But when I run it with the classpath in quotes, it again runs successfully:

java -Xmx16G -cp "/home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet True -serverProperties corenlp_server-7a3beb5da9fb4b5b.props -preload tokenize,ssplit,pos,lemma,ner

Run with success:

[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP -     Threads: 5
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[main] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.5 sec].
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [1.1 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0.4 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0.6 sec].
[main] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
[main] INFO edu.stanford.nlp.time.TimeExpressionExtractorImpl - Using following SUTime rules: edu/stanford/nlp/models/sutime/defs.sutime.txt,edu/stanford/nlp/models/sutime/english.sutime.txt,edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 580704 unique entries out of 581863 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_caseless.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 4869 unique entries out of 4869 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_cased.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 585573 unique entries from 2 files
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:9000

I am stuck on some silly error. Please help.

@AngledLuffa
Collaborator

AngledLuffa commented Apr 14, 2020 via email

@AngledLuffa
Collaborator

AngledLuffa commented Apr 14, 2020 via email

@skmalviya
Author

Oh, troublesome typos! I corrected them and ran it again, with success this time :)
But instead of printing the output, it spits out an error 15-20 seconds after starting the server:

(torch_gpu36) naive@silp30:~/Documents/shrikant/Dialogue_Implement/DST$ python stanza_eng.py 
<stanza.server.client.CoreNLPClient object at 0x7f556704ef60>
Starting server with command: java -Xmx16G -cp /home/naive/Documents/shrikant/Dialogue_Implement/DST/stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 60000 -threads 5 -maxCharLength 100000 -quiet False -serverProperties corenlp_server-f8b8119e9b084115.props -preload tokenize,ssplit,pos,lemma,ner
[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP -     Threads: 5
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[main] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.5 sec].
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [1.1 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0.4 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0.6 sec].
[main] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
[main] INFO edu.stanford.nlp.time.TimeExpressionExtractorImpl - Using following SUTime rules: edu/stanford/nlp/models/sutime/defs.sutime.txt,edu/stanford/nlp/models/sutime/english.sutime.txt,edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 580704 unique entries out of 581863 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_caseless.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 4869 unique entries out of 4869 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_cased.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 585573 unique entries from 2 files
Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity.
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:9000
Traceback (most recent call last):
  File "stanza_eng.py", line 17, in <module>
    document = client.annotate(text)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 431, in annotate
    r = self._request(text.encode('utf-8'), request_properties, **kwargs)
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 342, in _request
    self.ensure_alive()
  File "/home/naive/.conda/envs/torch_gpu36/lib/python3.6/site-packages/stanza/server/client.py", line 161, in ensure_alive
    raise PermanentlyFailedException("Timed out waiting for service to come alive.")
stanza.server.client.PermanentlyFailedException: Timed out waiting for service to come alive.

@AngledLuffa
Collaborator

AngledLuffa commented Apr 14, 2020 via email

@skmalviya
Author

What would be a suitable value to resolve the issue? I set -timeout 1500000, but I am still getting the same error...

@AngledLuffa
Collaborator

AngledLuffa commented Apr 14, 2020 via email

@skmalviya
Author

I have sufficient memory; I am working on a server with 256 GB of RAM. Here is the system configuration:

(torch_gpu36) naive@silp30:~/Documents/shrikant/Dialogue_Implement/DST$ lscpu
Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                16
On-line CPU(s) list:   0-15
Thread(s) per core:    2
Core(s) per socket:    8
Socket(s):             1
NUMA node(s):          1
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 63
Model name:            Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz
Stepping:              2
CPU MHz:               2397.214
BogoMIPS:              4794.42
Virtualization:        VT-x
L1d cache:             32K
L1i cache:             32K
L2 cache:              256K
L3 cache:              20480K
NUMA node0 CPU(s):     0-15

The problem is probably something very silly, I know, but I am unable to see it.

@AngledLuffa
Collaborator

AngledLuffa commented Apr 14, 2020 via email

@skmalviya
Author

Maybe the problem was due to running 'python stanza_eng.py' remotely on the server.
When I ran the same code locally with the same 'stanford-corenlp-full-2018-10-05' library, it ran successfully, as follows:

shrikant@Term016:~$ python3 stanza_eng.py 
<stanza.server.client.CoreNLPClient object at 0x7f87c44f9828>
Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity.
Starting server with command: java -Xmx16G -cp /home/shrikant/stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 150000000 -threads 5 -maxCharLength 100000 -quiet False -serverProperties corenlp_server-6861bb92bb5e4036.props -preload tokenize,ssplit,pos,lemma,ner
[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP -     Threads: 5
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[main] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.7 sec].
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [1.1 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0.5 sec].
[main] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0.6 sec].
[main] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
[main] INFO edu.stanford.nlp.time.TimeExpressionExtractorImpl - Using following SUTime rules: edu/stanford/nlp/models/sutime/defs.sutime.txt,edu/stanford/nlp/models/sutime/english.sutime.txt,edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 580704 unique entries out of 581863 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_caseless.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 4869 unique entries out of 4869 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_cased.tab, 0 TokensRegex patterns.
[main] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 585573 unique entries from 2 files
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:9000
[pool-1-thread-3] INFO CoreNLP - [/127.0.0.1:54034] API call w/annotators tokenize,ssplit,pos,lemma,ner
Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity.
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.protobuf.UnsafeUtil (file:/home/shrikant/stanford-corenlp-full-2018-10-05/protobuf.jar) to field java.nio.Buffer.address
WARNING: Please consider reporting this to the maintainers of com.google.protobuf.UnsafeUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
malviya
<class 'CoreNLP_pb2.Document'>

I do not know the exact reason; it was really difficult to assess and debug. Thank you all for your support.

@qipeng
Collaborator

qipeng commented Apr 21, 2020

@skmalviya do you have sufficient permissions on that server to bind processes to localhost ports? If not, then unfortunately the CoreNLP server won't start properly.
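
(As a quick, stanza-independent check, a small sketch like the following can tell you whether binding a localhost port is allowed on that machine.)

import socket

# Try to bind the port the CoreNLP server would use; if this raises an
# OSError/PermissionError, the server process will not be able to bind it either.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    sock.bind(("localhost", 9000))
    print("able to bind localhost:9000")
except OSError as e:
    print("cannot bind localhost:9000:", e)
finally:
    sock.close()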

@LiuYuLOL

@skmalviya do you have sufficient permissions on that server to bind processes to localhost ports? If not, then unfortunately the CoreNLP server won't start properly.

Good suggestion! Any idea how to check my permissions on the server?

@touhi99

touhi99 commented Apr 30, 2020

I also have the same problem while trying a sample snippet from the tutorial. I tweaked the parameters with different values, but it is still the same timeout problem. I do have permission to bind a localhost port (I have used other Python wrappers for CoreNLP before).

Here is my code snippet:

from stanza.server import CoreNLPClient

with CoreNLPClient(annotators=['parse'], timeout=30000, memory='4G', classpath='stanford-corenlp-full-2020-04-20') as client:
    ann = client.annotate("Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people.")
    sentence = ann.sentence[0]
    print(sentence) 

Here is the error:

 ann = client.annotate("Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people.")
  File "/home/alamtl/Desktop/temptagger/myvenv/lib/python3.6/site-packages/stanza/server/client.py", line 470, in annotate
    r = self._request(text.encode('utf-8'), request_properties, **kwargs)
  File "/home/alamtl/Desktop/temptagger/myvenv/lib/python3.6/site-packages/stanza/server/client.py", line 379, in _request
    self.ensure_alive()
  File "/home/alamtl/Desktop/temptagger/myvenv/lib/python3.6/site-packages/stanza/server/client.py", line 203, in ensure_alive
    raise PermanentlyFailedException("Timed out waiting for service to come alive.")
stanza.server.client.PermanentlyFailedException: Timed out waiting for service to come alive.

@AngledLuffa
Collaborator

@LiuYuLOL you can check whether or not it is possible to bind services in a couple of ways. One would be to start a Flask service within Python. Another would be to run StanfordCoreNLPServer from the command line (the Python process will give you the command line) and direct your browser to the appropriate localhost port.
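
For example, a minimal Flask check could look like the sketch below (illustrative only, and assumes Flask is installed); if it starts and http://localhost:9000 is reachable in the browser, binding local ports is not the problem.

from flask import Flask

# Tiny throwaway service purely to test that this user can bind a localhost port.
app = Flask(__name__)

@app.route('/')
def ping():
    return 'ok'

if __name__ == '__main__':
    app.run(host='localhost', port=9000)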

@AngledLuffa
Collaborator

@TAPOS12 we need to know more about your configuration, such as the OS, whether you are running from inside Jupyter, etc.

@LiuYuLOL

LiuYuLOL commented May 4, 2020

@LiuYuLOL you can check whether or not it is possible to bind services in a couple ways. One would be to start a flask service within python. Another would be to run StanfordCoreNLPServer at a command line (the python process will give you the command line) and direct your browser to the appropriate localhost port.

Thanks for your reply. Could you explain more about the first one, the Flask one?

Besides, the command line given by the Python process fails with "Error: Could not find or load main class". However, I can run the Java server directly with the command

java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 15000

@AngledLuffa
Collaborator

AngledLuffa commented May 4, 2020 via email

@LiuYuLOL

LiuYuLOL commented May 4, 2020

Yes, this Java error is about the CLASSPATH; however, the Python command already sets -cp. Below is the command:

java -Xmx5G -cp /home/uqyliu42/resources/stanfordnlp_resources/stanford-corenlp-full-2018-10-05/* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 8900 -timeout 30000 -threads 5 -maxCharLength 100000 -quiet False -serverProperties corenlp_server-9786450377a548c7.props -preload tokenize,ssplit,pos,lemma,ner,parse,depparse,dcoref

If I run it directly, I get:

Error: Could not find or load main class .home.uqyliu42.resources.stanfordnlp_resources.stanford-corenlp-full-2018-10-05.corenlp.sh

@AngledLuffa
Collaborator

AngledLuffa commented May 4, 2020 via email

@RishabhMaheshwary

RishabhMaheshwary commented May 4, 2020

I am also receiving the same issue. When I start the server with the java command (path enclosed in double quotes), the server starts and logs StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:8088.
But when I use it from a Python script, it gives the
PermanentlyFailedException: Timed out error.
I am running it on an Ubuntu server.
EDIT:

Solved it using export NO_PROXY='localhost' and also export no_proxy='localhost'

@AngledLuffa
Collaborator

AngledLuffa commented May 4, 2020 via email

@RishabhMaheshwary

Actually, I was running it on an Ubuntu server, so I had no GUI.
I found that the issue was with our college proxy, so I used the NO_PROXY method.
I did not try the start_server=False method.

@skmalviya
Author

skmalviya commented May 12, 2020

@skmalviya do you have sufficient permissions on that server to bind processes to localhost ports? If not, then unfortunately the CoreNLP server won't start properly.

@qipeng On your suggestion, I tried it with sudo this time, and it ran successfully. Thanks for the advice. But one doubt remains: on my system, the same program ran successfully even without sudo, whereas on the server it was unsuccessful until I ran it with sudo.

The comment by @AngledLuffa could be followed to explore it further.

@skmalviya
Author

skmalviya commented May 12, 2020

Here is how I sorted things out:

1. When running on my system locally (Checked on python3.6.9):

* Install stanza==1.0.1
* Download & extract **stanford-corenlp-full-2018-10-05.zip**
* Run the following Python script with the proper **CORENLP_HOME** path:

import os
os.environ["CORENLP_HOME"] = '/home/shrikant/stanford-corenlp-full-2018-10-05'

# Import client module
from stanza.server import CoreNLPClient


client = CoreNLPClient(timeout=150000000, be_quiet=False, annotators=['tokenize','ssplit', 'pos', 'lemma', 'ner'], memory='16G', endpoint='http://localhost:9000')
print(client)

#client.start()
#import time; time.sleep(10)

text = "Albert Einstein was a German-born theoretical physicist. He developed the theory of relativity."
print (text)
document = client.annotate(text)
print ('malviya')
print(type(document))

2. When running on a remote server (Checked on python3.6.9):

* Install stanza==1.0.1
* Download & extract **stanford-corenlp-full-2018-10-05.zip**
* Run the same Python script with the proper **CORENLP_HOME** path, but **WITH SUDO** this time.

@LiuYuLOL

I solved the issue by disabling the proxy env variables:

unset http_proxy
unset https_proxy

My university uses a proxy through which localhost cannot be reached, which leads to the timeout issue.

@AngledLuffa
Collaborator

AngledLuffa commented May 12, 2020 via email

@semal

semal commented Sep 17, 2020

Maybe it's a bug in self.ensure_alive(); removing that line should work.

@AngledLuffa
Collaborator

AngledLuffa commented Sep 17, 2020 via email

@dsivakumar

dsivakumar commented Jul 2, 2022

There are two problems I found in my environment.
First, without realizing my Java runtime was 32-bit, I was trying 1G, 2G, and 3G; only 1G was accepted, and that wasn't enough for CoreNLP.
Second, it took too much time to start up, resulting in a timeout!

raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=9000): Read timed out. (read timeout=120.0)

Solution

I uninstalled the 32-bit runtime and installed a 64-bit one, then started the server outside the Python code:
java -Xmx4G -cp C:\Users\Siva\stanza_corenlp\* edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 600000 -threads 2 -maxCharLength 100000 -quiet False -preload -outputFormat serialized

And used the local server like this:

client = CoreNLPClient(annotators='tokenize,ssplit,pos,lemma,ner,depparse',
            start_server=corenlp.StartServer.DONT_START) ###
OR
client = CoreNLPClient('http://localhost:9000') 

It worked. One check: open localhost:9000 in the browser; if all works well, run the Python code.

Also, Java needs Windows permission for port access; don't ignore the popup!
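
(For reference, with a recent stanza version the connect-to-an-existing-server pattern can also be written as in the sketch below; this assumes the StartServer enum is importable from stanza.server, which may vary by version.)

from stanza.server import CoreNLPClient, StartServer

# Connect to a CoreNLP server that was already started manually on port 9000;
# stanza will not launch or shut down its own server process.
with CoreNLPClient(start_server=StartServer.DONT_START,
                   endpoint='http://localhost:9000',
                   annotators='tokenize,ssplit,pos,lemma,ner,depparse') as client:
    ann = client.annotate("Chris Manning is a nice person.")
    print(ann.sentence[0].token[0].word)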

@git-the-language-nerd

I am also receiving the same issue. When I start the server with the java command (path enclosed in double quotes), the server starts and logs StanfordCoreNLPServer listening at /0:0:0:0:0:0:0:0:8088. But when I use it from a Python script, it gives the PermanentlyFailedException: Timed out error. I am running it on an Ubuntu server. EDIT:

Solved it using export NO_PROXY='localhost' and also export no_proxy='localhost'

This did it for me, i.e.

os.environ['NO_PROXY'] = 'localhost'
os.environ['no_proxy'] = 'localhost'

at the top of my notebook
