[HIVEMALL-30] Increase maximum Java heap size and decrease MaxPermGen to avoid OutOfMemoryError #22
Conversation
@maropu Could you take a look at this?
TravisCI always passes these tests, though; why did you get the error? What's your environment?
Oh, I got you. So, how about setting these values in line with the Spark ones (e.g., -mx3g)? See: https://github.com/apache/spark/blob/master/pom.xml#L2066
okay, LGTM cc: @myui
@maropu BTW, why
@myui When running the tests, a SparkContext and the unit tests run in the same JVM. The context uses at most 1g of memory by default, so under some conditions the unit tests run short of memory, I think. If we set the value to 3g, the context uses 1g and the tests get 2g. That sounds fine to me.
@maropu It seems full GC is still happening in some cases...
@myui How about 512m? Sorry, I forgot why I set that value there, but I checked the Spark configuration and found that it uses 512m.
I'll try to set 512m.
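A minimal sketch of how these two JVM options might be wired into the test run via a Maven `pom.xml`. This assumes the build forks tests through a plugin that honors an `argLine` (as Spark's build does); the plugin choice and exact values here are illustrative, not taken from the Hivemall build:

```xml
<!-- Illustrative sketch only: pass JVM options to forked test JVMs.
     Assumes maven-surefire-plugin; a scalatest plugin would take an
     equivalent <argLine>. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- 3g heap: roughly 1g for the embedded SparkContext, 2g for tests.
         512m permanent generation, matching the Spark build's value. -->
    <argLine>-Xmx3g -XX:MaxPermSize=512m</argLine>
  </configuration>
</plugin>
```

Note that `MaxPermSize` only applies to Java 7 and earlier; on Java 8+ the permanent generation was replaced by Metaspace and the flag is ignored with a warning.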
okay, thanks!
What changes were proposed in this pull request?
Increase the maximum Java heap size and decrease `MaxPermGen` to avoid `OutOfMemoryError`.

How was this patch tested?
Manual tests.