[SPARK-30060][CORE] Rename metrics enable/disable configs
### What changes were proposed in this pull request?
This proposes to introduce a naming convention for the Spark metrics configuration parameters that enable/disable metrics source reporting via the Dropwizard metrics library, `spark.metrics.sourceNameCamelCase.enabled`, and updates two parameters to follow it.

### Why are the changes needed?
Currently Spark has a few parameters to enable/disable metrics reporting. Their naming pattern is not uniform, which can create confusion. We have:
`spark.metrics.static.sources.enabled`
`spark.app.status.metrics.enabled`
`spark.sql.streaming.metricsEnabled`

### Does this PR introduce any user-facing change?
Two parameters for enabling/disabling metrics reporting, new in Spark 3.0, are renamed: `spark.metrics.static.sources.enabled` -> `spark.metrics.staticSources.enabled` and `spark.app.status.metrics.enabled` -> `spark.metrics.appStatusSource.enabled`.
Note: `spark.sql.streaming.metricsEnabled` is left unchanged as it is already in use in Spark 2.x.
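
For illustration, both renamed parameters can be set explicitly at submit time. This is a sketch, not part of the PR; the application class and jar names below are hypothetical placeholders:

```shell
# Sketch: enable the renamed metrics sources explicitly at submit time.
# org.example.MyApp and myapp.jar are placeholders, not part of this PR.
spark-submit \
  --conf spark.metrics.staticSources.enabled=true \
  --conf spark.metrics.appStatusSource.enabled=true \
  --class org.example.MyApp \
  myapp.jar
```

Both values can equally be set in `spark-defaults.conf` or on a `SparkConf` before the session starts.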

### How was this patch tested?
Manually tested
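
One possible way to reproduce the manual test (an assumption about the workflow, not stated in this PR): with `spark.metrics.appStatusSource.enabled=true`, the appStatus counters should appear in the driver's default metrics servlet while the application runs:

```shell
# Hypothetical spot check against a locally running application;
# 4040 is the default driver UI port and /metrics/json the default
# path of Spark's MetricsServlet sink.
curl -s http://localhost:4040/metrics/json/ | grep -i appStatus
```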

Closes #26692 from LucaCanali/uniformNamingMetricsEnableParameters.

Authored-by: Luca Canali <luca.canali@cern.ch>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
LucaCanali authored and dongjoon-hyun committed Dec 3, 2019
1 parent 196ea93 commit 60f20e5
Showing 4 changed files with 11 additions and 11 deletions.
@@ -55,8 +55,8 @@ private[spark] object Status {
     .intConf
     .createWithDefault(Int.MaxValue)

-  val APP_STATUS_METRICS_ENABLED =
-    ConfigBuilder("spark.app.status.metrics.enabled")
+  val METRICS_APP_STATUS_SOURCE_ENABLED =
+    ConfigBuilder("spark.metrics.appStatusSource.enabled")
       .doc("Whether Dropwizard/Codahale metrics " +
         "will be reported for the status of the running spark app.")
       .booleanConf
@@ -638,7 +638,7 @@ package object config {
     .createOptional

   private[spark] val METRICS_STATIC_SOURCES_ENABLED =
-    ConfigBuilder("spark.metrics.static.sources.enabled")
+    ConfigBuilder("spark.metrics.staticSources.enabled")
       .doc("Whether to register static sources with the metrics system.")
       .booleanConf
       .createWithDefault(true)
@@ -22,7 +22,7 @@ import AppStatusSource.getCounter
 import com.codahale.metrics.{Counter, Gauge, MetricRegistry}

 import org.apache.spark.SparkConf
-import org.apache.spark.internal.config.Status.APP_STATUS_METRICS_ENABLED
+import org.apache.spark.internal.config.Status.METRICS_APP_STATUS_SOURCE_ENABLED
 import org.apache.spark.metrics.source.Source

 private [spark] class JobDuration(val value: AtomicLong) extends Gauge[Long] {
@@ -71,7 +71,7 @@ private[spark] object AppStatusSource {
   }

   def createSource(conf: SparkConf): Option[AppStatusSource] = {
-    Option(conf.get(APP_STATUS_METRICS_ENABLED))
+    Option(conf.get(METRICS_APP_STATUS_SOURCE_ENABLED))
       .filter(identity)
       .map { _ => new AppStatusSource() }
   }
12 changes: 6 additions & 6 deletions docs/monitoring.md
@@ -924,7 +924,7 @@ This is the component with the largest amount of instrumented metrics

 - namespace=HiveExternalCatalog
   - **note:**: these metrics are conditional to a configuration parameter:
-    `spark.metrics.static.sources.enabled` (default is true)
+    `spark.metrics.staticSources.enabled` (default is true)
   - fileCacheHits.count
   - filesDiscovered.count
   - hiveClientCalls.count
@@ -933,7 +933,7 @@ This is the component with the largest amount of instrumented metrics

 - namespace=CodeGenerator
   - **note:**: these metrics are conditional to a configuration parameter:
-    `spark.metrics.static.sources.enabled` (default is true)
+    `spark.metrics.staticSources.enabled` (default is true)
   - compilationTime (histogram)
   - generatedClassSize (histogram)
   - generatedMethodSize (histogram)
@@ -962,8 +962,8 @@ This is the component with the largest amount of instrumented metrics
   - queue.executorManagement.listenerProcessingTime (timer)

 - namespace=appStatus (all metrics of type=counter)
-  - **note:** Introduced in Spark 3.0. Conditional to configuration parameter:
-    `spark.app.status.metrics.enabled=true` (default is false)
+  - **note:** Introduced in Spark 3.0. Conditional to a configuration parameter:
+    `spark.metrics.appStatusSource.enabled` (default is false)
   - stages.failedStages.count
   - stages.skippedStages.count
   - stages.completedStages.count
@@ -1057,7 +1057,7 @@ when running in local mode.

 - namespace=HiveExternalCatalog
   - **note:**: these metrics are conditional to a configuration parameter:
-    `spark.metrics.static.sources.enabled` (default is true)
+    `spark.metrics.staticSources.enabled` (default is true)
   - fileCacheHits.count
   - filesDiscovered.count
   - hiveClientCalls.count
@@ -1066,7 +1066,7 @@ when running in local mode.

 - namespace=CodeGenerator
   - **note:**: these metrics are conditional to a configuration parameter:
-    `spark.metrics.static.sources.enabled` (default is true)
+    `spark.metrics.staticSources.enabled` (default is true)
   - compilationTime (histogram)
   - generatedClassSize (histogram)
   - generatedMethodSize (histogram)
