
Commit

Update helpJvmUsedMemoryHeuristic.scala.html
skakker authored and swasti committed Jan 8, 2018
1 parent b49026b commit 9f8bdf2
Showing 1 changed file with 2 additions and 12 deletions.
14 changes: 2 additions & 12 deletions app/views/help/spark/helpJvmUsedMemoryHeuristic.scala.html
@@ -15,16 +15,6 @@
*@
<p>This is a heuristic for peak JVM used memory.</p>
<h4>Executor Max Peak JVM Used Memory</h4>
<p>This is to analyse whether the executor memory is set to a good value. To avoid wasted memory, it checks whether the peak JVM used memory is reasonably close to the allocated executor memory (spark.executor.memory) -- if it is much smaller, then the executor memory should be reduced.</p>
<p>The current thresholds are: <br>
Low: spark.executor.memory >= 1.5 * (max peakJvmUsedMemory + 300MB)<br>
Moderate: spark.executor.memory >= 2 * (max peakJvmUsedMemory + 300MB)<br>
Severe: spark.executor.memory >= 2.5 * (max peakJvmUsedMemory + 300MB)<br>
Critical: spark.executor.memory >= 3 * (max peakJvmUsedMemory + 300MB)</p>
<p>This is to analyse whether the executor memory is set to a good value. To avoid wasted memory, it checks whether the peak JVM used memory of the executor is reasonably close to the allocated executor memory, which is specified in spark.executor.memory. If the peak JVM used memory is much smaller, then the executor memory should be reduced.</p>
<h4>Driver Max Peak JVM Used Memory</h4>
<p>The allocated memory for the driver (spark.driver.memory) is examined to check whether it is much larger than the peak JVM memory used by the driver.</p>
<p>The current thresholds are: <br>
Low: spark.driver.memory >= 1.5 * (max peakJvmUsedMemory + 300MB)<br>
Moderate: spark.driver.memory >= 2 * (max peakJvmUsedMemory + 300MB)<br>
Severe: spark.driver.memory >= 2.5 * (max peakJvmUsedMemory + 300MB)<br>
Critical: spark.driver.memory >= 3 * (max peakJvmUsedMemory + 300MB)</p>
<p>This is to analyse whether the driver memory is set to a good value. To avoid wasted memory, it checks whether the peak JVM used memory of the driver is reasonably close to the allocated driver memory, which is specified in spark.driver.memory. If the peak JVM used memory is much smaller, then the driver memory should be reduced.</p>
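The executor and driver sections above apply the same formula: compare the allocated memory against multiples of (max peak JVM used memory + 300MB). A minimal Scala sketch of that severity mapping is below; the object and method names are illustrative assumptions, not the heuristic's actual implementation, and all values are in MB.

```scala
// Hypothetical sketch of the threshold logic described above.
// Names (JvmUsedMemorySeverity, severityOf) are assumptions for illustration;
// they are not the actual classes in the heuristic's source.
object JvmUsedMemorySeverity {
  // Fixed buffer from the thresholds: allocated memory is compared
  // against multiples of (max peak JVM used memory + 300MB).
  val ReservedMb: Long = 300L

  // allocatedMb: spark.executor.memory or spark.driver.memory, in MB.
  // maxPeakJvmUsedMb: max peak JVM used memory observed, in MB.
  def severityOf(allocatedMb: Long, maxPeakJvmUsedMb: Long): String = {
    val base = (maxPeakJvmUsedMb + ReservedMb).toDouble
    if (allocatedMb >= 3.0 * base) "Critical"       // >= 3x
    else if (allocatedMb >= 2.5 * base) "Severe"    // >= 2.5x
    else if (allocatedMb >= 2.0 * base) "Moderate"  // >= 2x
    else if (allocatedMb >= 1.5 * base) "Low"       // >= 1.5x
    else "None"                                     // allocation is close to peak usage
  }
}
```

For example, with a peak JVM used memory of 1000MB, an 8GB allocation lands in Critical (8192 >= 3 * 1300), while a 2GB allocation is only Low (2048 >= 1.5 * 1300 but below 2 * 1300), matching the thresholds listed above.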
