This repository has been archived by the owner on Sep 20, 2022. It is now read-only.

[HIVEMALL-30] Increase maximum Java heap size and decrease MaxPermGen to avoid OutOfMemoryError #22

Closed
wants to merge 4 commits into apache:master from wangyum:HIVEMALL-30

Conversation

wangyum
Member

@wangyum wangyum commented Jan 22, 2017

What changes were proposed in this pull request?

Increase maximum Java heap size and decrease MaxPermGen to avoid OutOfMemoryError.

How was this patch tested?

Manual tests

@coveralls

coveralls commented Jan 22, 2017


Coverage remained the same at 35.842% when pulling 1d18c7d on wangyum:HIVEMALL-30 into ed16ca0 on apache:master.

wangyum changed the title from "[HIVEMALL-30] Increase -Xmx from 1024 to 1536" to "[HIVEMALL-30] Increase -Xmx to 1536 to avoid OutOfMemoryError" on Jan 22, 2017
@myui
Member

myui commented Jan 22, 2017

@maropu Could you take a look at?

wangyum changed the title from "[HIVEMALL-30] Increase -Xmx to 1536 to avoid OutOfMemoryError" to "[HIVEMALL-30] Increase -Xmx to -Xmx1536m to avoid OutOfMemoryError" on Jan 23, 2017
@maropu
Member

maropu commented Jan 23, 2017

TravisCI always passes these tests, so why did you get the error? What's your environment?

@maropu
Member

maropu commented Jan 23, 2017

Oh, I see. So, how about setting these values in line with Spark's (e.g., -mx3g)? See: https://github.com/apache/spark/blob/master/pom.xml#L2066
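
Spark pins its test-JVM memory settings in the `argLine` of its test plugin configuration in the pom. A comparable sketch for a Maven pom is shown below; the plugin choice and exact values here are illustrative assumptions, not Hivemall's actual configuration:

```xml
<!-- Hypothetical sketch: plugin and values are assumptions for illustration -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <configuration>
    <!-- Heap ceiling for the forked test JVM. Note that -XX:MaxPermSize
         only matters on JDK 7 and earlier: PermGen was removed in JDK 8,
         where the flag is ignored with a warning. -->
    <argLine>-Xmx3g -XX:MaxPermSize=512m</argLine>
  </configuration>
</plugin>
```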

@maropu
Member

maropu commented Jan 23, 2017

okay, LGTM cc: @myui

@myui
Member

myui commented Jan 23, 2017

@maropu BTW, why is the error `"mvn -q scalastyle:check test -Pspark-2.0" exited with 1` happening?

@maropu
Member

maropu commented Jan 23, 2017

@myui When running the tests, a Spark context and the unit tests work in the same JVM. The context uses at most 1g of memory by default, so under some conditions the unit tests run short of memory, I think. If we set the value to 3g, the context uses 1g and the tests get 2g. That sounds fine to me.
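
One way to sanity-check how much heap a test JVM actually received (i.e., which -Xmx took effect) is to print `Runtime.maxMemory()`; a minimal, self-contained sketch:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the heap ceiling this JVM will attempt to use,
        // which reflects the -Xmx setting it was launched with
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Running it with `java -Xmx3g HeapCheck` should report a figure close to 3072 MB (the exact number varies slightly by JVM and GC).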

@coveralls

coveralls commented Jan 23, 2017


Coverage remained the same at 35.842% when pulling 37da226 on wangyum:HIVEMALL-30 into ed16ca0 on apache:master.

@myui
Member

myui commented Jan 23, 2017

@maropu It seems full GC is still happening in some cases...

HivemallFeatureOpsSuite:
No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself.

MaxPermGen=1024m might be too big, and the JVM might need more space for other things. Do we need MaxPermGen at all? Fewer parameters are better for configuring the JVM. We can use just 4GB on TravisCI, while there are 3 tasks (openjdk7, oraclejdk8, oraclejdk8) in each job.

@maropu
Member

maropu commented Jan 23, 2017

@myui How about 512m? Sorry, I forgot why I set that value there, but I checked the Spark configuration and found that it uses 512m.

@wangyum
Member Author

wangyum commented Jan 23, 2017

I'll try to set 512m.

@coveralls


Coverage remained the same at 35.842% when pulling 20c2eea on wangyum:HIVEMALL-30 into ed16ca0 on apache:master.

1 similar comment

@asfgit asfgit closed this in d95f1e7 Jan 23, 2017
@myui
Member

myui commented Jan 23, 2017

@wangyum @maropu Thanks. Merged with some modifications; the configuration for spark-1.6 should also be changed.

@maropu
Member

maropu commented Jan 23, 2017

okay, thanks!

wangyum changed the title from "[HIVEMALL-30] Increase -Xmx to -Xmx1536m to avoid OutOfMemoryError" to "[HIVEMALL-30] Increase maximum Java heap size and decrease MaxPermGen to avoid OutOfMemoryError" on Jan 23, 2017
5 participants