Investigate Heap Requirements for Smarti #147
Comments
* added a configuration that allows configuring the executor service pool size for processing. The default is set to `2` as requested by #145; solves #147
* changed the configuration for the Stanford NLP processing to use the Shift-Reduce Parser, as it has a lower memory footprint
* added `nlp.stanfordnlp.de.parseMaxlen=40` to prevent parse tree generation for long sentences that could end up in OOM situations

NOTE: Those changes depend on bug fixes in redlink-nlp and an update to Stanford NLP 3.8.0
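The changes above could look roughly like the following configuration fragment. Only `nlp.stanfordnlp.de.parseMaxlen` is taken from this issue; the pool-size property name and the Shift-Reduce model path are assumptions for illustration, not confirmed Smarti keys:

```properties
# Hypothetical property name: executor service pool size for message processing (default 2, see #145)
smarti.processing.numThreads=2

# Cap parse tree generation at 40 tokens per sentence (property name from this issue)
nlp.stanfordnlp.de.parseMaxlen=40

# Assumed model path: use the German Shift-Reduce parser model (lower memory footprint than PCFG)
nlp.stanfordnlp.de.parseModel=edu/stanford/nlp/models/srparser/germanSR.ser.gz
```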
The investigation concluded that there are no memory leaks present, even during long processing runs. The OOM errors could be traced down to messages containing long sentences (or other strings, e.g. ASCII graphics) that cause the Stanford NLP parser to require huge amounts of memory. Several solutions were tested, with the following results:
With this configuration no OOM errors were encountered. Only when processing ASCII graphics did the PCFG parser drive the system to its limits, causing a lot of GC overhead. After processing, the system recovered without problems and continued normally. NOTE: Those changes require the newest SNAPSHOT version of
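The effect of the `parseMaxlen` limit can be illustrated with a minimal sketch. The helper below is hypothetical (not Smarti or Stanford NLP code); it only shows the idea of skipping parse tree generation for sentences whose token count exceeds a threshold, which is what protects the parser from ASCII-graphics "sentences":

```java
/**
 * Minimal sketch of a sentence-length guard, similar in spirit to the
 * nlp.stanfordnlp.de.parseMaxlen=40 setting. Class and method names are
 * hypothetical illustrations, not part of Smarti or Stanford NLP.
 */
public class ParseGuard {

    /** Returns true if the sentence has at most maxTokens whitespace-separated tokens. */
    static boolean shouldParse(String sentence, int maxTokens) {
        if (sentence == null || sentence.trim().isEmpty()) {
            return false; // nothing to parse
        }
        return sentence.trim().split("\\s+").length <= maxTokens;
    }

    public static void main(String[] args) {
        // A normal short sentence passes the guard
        System.out.println(shouldParse("This is a short sentence .", 40)); // true

        // ASCII graphics tokenize into one very long "sentence" and are skipped
        String ascii = "+----+----+ | :: | :: | +----+----+".repeat(10);
        System.out.println(shouldParse(ascii, 40)); // false
    }
}
```

Skipping the parse for such inputs trades parse trees for bounded memory use, which matches the behavior described above.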
At least it is very cool that we have knowledge in that depth.
@westei Can you please write a short doc on how you ran the stress test for the resource-consumption behavior of Smarti?
Created #179 for the documentation.
When running with `-Xmx4g`, a `java.lang.OutOfMemoryError: Java heap space` was encountered. As this is unexpected, we need to further investigate the memory consumption of Smarti. This includes:
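For such an investigation, the JVM can be started with heap dump and GC logging options so that the next OOM leaves evidence behind. The flags below are standard HotSpot options (Java 8 era); the jar name is a placeholder, not the actual Smarti artifact:

```sh
# Placeholder jar name; substitute the real Smarti artifact.
# -XX:+HeapDumpOnOutOfMemoryError writes an .hprof dump on OOM for offline analysis,
# -Xloggc / -XX:+PrintGCDetails record GC activity to spot excessive GC overhead.
java -Xmx4g \
     -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/var/log/smarti/ \
     -Xloggc:/var/log/smarti/gc.log -XX:+PrintGCDetails \
     -jar smarti.jar
```

The resulting heap dump can then be inspected with a tool such as Eclipse MAT to see which objects dominate the heap.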
NOTE: marking this as enhancement with the intention to create additional issues based on investigation results