
[SPARK-3562]Periodic cleanup event logs #4214

Closed
wants to merge 8 commits into from

Conversation

viper-kun (Contributor)

No description provided.

-  conf.getInt("spark.history.updateInterval", 10)) * 1000
+  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
+    .orElse(conf.getOption("spark.history.fs.updateInterval")
+      .map(warnUpdateInterval("spark.history.fs.updateInterval", _)))
Contributor

Just for neatness, I'd have a single method that does both conf.getOption() and .map():

private def getDeprecatedConfig(conf: SparkConf, key: String): Option[String]

Then just do:

conf.getOption(newKey).orElse(getDeprecatedConfig(conf, old1)).orElse(getDeprecatedConfig(conf, old2))
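The suggested helper pattern could be sketched as below. `FakeConf` and the `warnUpdateInterval` body are hypothetical stand-ins used only to keep the example self-contained; they are not Spark's actual `SparkConf` API:

```scala
// Sketch of the suggested "single method" pattern (hypothetical stand-ins).
object DeprecatedConfSketch {
  // Stand-in for SparkConf: just wraps a Map of settings.
  case class FakeConf(settings: Map[String, String]) {
    def getOption(key: String): Option[String] = settings.get(key)
  }

  // Logs a deprecation warning and passes the value through unchanged.
  private def warnUpdateInterval(key: String, value: String): String = {
    Console.err.println(s"Warning: $key is deprecated")
    value
  }

  // Single method combining conf.getOption() and .map(), as suggested.
  def getDeprecatedConfig(conf: FakeConf, key: String): Option[String] =
    conf.getOption(key).map(warnUpdateInterval(key, _))

  // New key wins; otherwise fall back through the deprecated keys in order.
  def resolve(conf: FakeConf, newKey: String, old1: String, old2: String): Option[String] =
    conf.getOption(newKey)
      .orElse(getDeprecatedConfig(conf, old1))
      .orElse(getDeprecatedConfig(conf, old2))
}
```

With this shape, each deprecated key is handled by one call instead of a repeated `getOption`/`map` pair at every use site.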

@vanzin (Contributor)

vanzin commented Jan 27, 2015

LGTM aside from some very minor things.

@@ -163,9 +179,6 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
    * applications that haven't been updated since last time the logs were checked.
    */
   private[history] def checkForLogs(): Unit = {
-    lastLogCheckTimeMs = getMonotonicTimeMs()
-    logDebug("Checking for logs. Time is now %d.".format(lastLogCheckTimeMs))
Contributor

why are you removing the logging?

Contributor

Good point. The tests don't expect anything to be happening in the background, instead they rely on being able to trigger these updates explicitly.

Contributor Author

The checkForLogs thread is now scheduled by a pool, so lastLogCheckTimeMs is no longer used; I removed it.
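For context, the pool-based scheduling described here can be sketched with a plain `java.util.concurrent` scheduler. This is an illustrative stand-alone example, not the actual FsHistoryProvider code; a counter stands in for `checkForLogs()`:

```scala
import java.util.concurrent.{Executors, ScheduledExecutorService, TimeUnit}
import java.util.concurrent.atomic.AtomicInteger

// Sketch: the pool drives the period itself, so the task needs no
// lastLogCheckTimeMs bookkeeping of its own.
object PeriodicCheckSketch {
  val checks = new AtomicInteger(0)

  def start(intervalMs: Long): ScheduledExecutorService = {
    val pool = Executors.newScheduledThreadPool(1)
    val task = new Runnable {
      def run(): Unit = checks.incrementAndGet() // stands in for checkForLogs()
    }
    // First run immediately, then repeat every intervalMs milliseconds.
    pool.scheduleAtFixedRate(task, 0, intervalMs, TimeUnit.MILLISECONDS)
    pool
  }
}
```

Because the executor tracks when the next run is due, the task itself can stay stateless with respect to timing.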

@andrewor14 (Contributor)

ok to test


private def getDeprecatedConfig(conf: SparkConf, key: String): Option[String] = {
  conf.getOption(key).map(warnUpdateInterval(key, _))
}
Contributor

can you do all of these through SparkConf.deprecatedConfigs instead of doing it here? You may need to rebase to master to get those changes. (I filed SPARK-5931 for this purpose, and I hope to eventually move all the deprecation logic there).
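A centralized deprecation table of the kind suggested might be sketched as below. The `deprecatedConfigs` map, its contents, and the `get` helper are assumptions for illustration, not the actual `SparkConf.deprecatedConfigs` implementation:

```scala
// Sketch of centralizing deprecation logic in one table (hypothetical names).
object CentralDeprecationSketch {
  // Hypothetical central table: deprecated key -> replacement key.
  private val deprecatedConfigs: Map[String, String] = Map(
    "spark.history.fs.updateInterval" -> "spark.history.fs.update.interval.seconds",
    "spark.history.updateInterval"    -> "spark.history.fs.update.interval.seconds"
  )

  // Resolve a key, consulting the deprecation table in one place.
  def get(settings: Map[String, String], key: String): Option[String] =
    settings.get(key).orElse {
      // If the new key is absent, look for any deprecated key that maps to it.
      deprecatedConfigs.collectFirst {
        case (oldKey, newKey) if newKey == key && settings.contains(oldKey) =>
          Console.err.println(s"Warning: $oldKey is deprecated; use $key instead")
          settings(oldKey)
      }
    }
}
```

The advantage is that call sites only ever ask for the current key; fallback and warning behavior live in a single table instead of being repeated per component.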

@andrewor14 (Contributor)

@viper-kun Thanks for re-opening this, and my apologies for having neglected the older patch. I left a few comments inline, most of which are minor ones that have to do with naming and docs. Otherwise this patch looks fairly straightforward and I would like to get it merged soon.

@SparkQA

SparkQA commented Feb 20, 2015

Test build #27791 has finished for PR 4214 at commit 70c28d6.

  • This patch fails Spark unit tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 25, 2015

Test build #27947 has finished for PR 4214 at commit 6e3d06b.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viper-kun (Contributor Author)

@andrewor14 Thanks for your review. Please retest it; I cannot get the test log.

@SparkQA

SparkQA commented Feb 26, 2015

Test build #27974 has finished for PR 4214 at commit 31674ee.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 26, 2015

Test build #27988 has finished for PR 4214 at commit 7a5b9c5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

-  conf.getInt("spark.history.updateInterval", 10)) * 1000
+  private val UPDATE_INTERVAL_MS = conf.getOption("spark.history.fs.update.interval.seconds")
+    .orElse(conf.getOption(SparkConf.translateConfKey("spark.history.fs.updateInterval", true)))
+    .orElse(conf.getOption(SparkConf.translateConfKey("spark.history.updateInterval", true)))
Contributor

We shouldn't call translateConfKey here, but I realize we need to do so if we want to warn the user. I will submit a separate patch to fix this behavior. In general I think the translateConfKey method should be private to SparkConf.

@andrewor14 (Contributor)

LGTM I'm merging this into master and I will file a new issue for the conf key issue I mentioned. Thanks for your patience @viper-kun!

@asfgit asfgit closed this in 8942b52 Feb 26, 2015
@viper-kun viper-kun deleted the cleaneventlog branch January 18, 2017 09:21
5 participants