Thanks for the library, it is really great and useful. However, I sometimes get the exception from the title.
It happens when I use new CoreNLPProcessor(withDiscourse = true). Without this flag, memory consumption is not as high, but it is still suspicious.
So I cannot run with withDiscourse even on my workstation, and in production (we use several cheap AWS instances) I occasionally get an OOM even without withDiscourse, sometimes even at the sbt compile stage.
I create that new CoreNLPProcessor in a companion object and call it from the appropriate class. I use it to annotate input text and to do some analysis with syntacticTree and dependencies, so I think there are no leaks in my code.
I'd appreciate your help in resolving this issue: how can I reduce memory usage? Thanks.
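For reference, here is a minimal sketch of the setup described above (the import path and exact constructor signature are assumptions and may differ between processors versions):

```scala
import org.clulab.processors.corenlp.CoreNLPProcessor

object Nlp {
  // One shared processor instance, created in a companion-style object.
  // withDiscourse = true enables the discourse parser, which is the memory-hungry part.
  lazy val processor = new CoreNLPProcessor(withDiscourse = true)
}

class Analyzer {
  def analyze(text: String): Unit = {
    // Annotate the input text (tokenization, POS tagging, parsing, ...).
    val doc = Nlp.processor.annotate(text)
    doc.sentences.foreach { s =>
      // Constituent tree and dependency graph produced during annotation.
      val tree = s.syntacticTree
      val deps = s.dependencies
      // ... application-specific analysis goes here ...
    }
  }
}
```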
One option to save some memory is to create CoreNLPProcessor with internStrings = false. By default, our processors "intern" Java Strings, which saves memory when you process long documents, BUT it keeps memory growing if you keep your program running for a long time.
In any case, this package unfortunately needs a good chunk of RAM to run. I recommend at least 3GB; 6GB has been sufficient in my experience for anything you might parse.
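For example, a minimal sketch (assuming your processors version exposes the internStrings constructor parameter):

```scala
import org.clulab.processors.corenlp.CoreNLPProcessor

// Disable string interning so the interned-string table does not grow
// without bound in a long-running process.
val proc = new CoreNLPProcessor(internStrings = false)
```

Independently of that, give the JVM a larger heap: launch sbt with -J-Xmx6g, or set fork := true and javaOptions += "-Xmx6g" in build.sbt when running your application from sbt.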