
[SPARK-10330] Add Scalastyle rule to require use of SparkHadoopUtil JobContext methods #8521

Closed
wants to merge 3 commits

Conversation

JoshRosen
Contributor

This is a follow-up to #8499; it adds a Scalastyle rule to mandate the use of SparkHadoopUtil's JobContext accessor methods and fixes the existing violations.
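For context, a Scalastyle rule of this kind is typically a regex check registered in scalastyle-config.xml. The PR's exact rule text is not shown in this conversation, so the snippet below is only an illustrative sketch; the customId, regex, and message are assumptions.

```xml
<!-- Illustrative sketch of a regex-based Scalastyle ban; the exact
     customId, regex, and message used by this PR are assumptions. -->
<check customId="getconfiguration" level="error"
       class="org.scalastyle.file.RegexChecker" enabled="true">
  <parameters>
    <parameter name="regex">\.getConfiguration\(</parameter>
  </parameters>
  <customMessage>Use SparkHadoopUtil.get.getConfigurationFromJobContext(job)
    instead of calling getConfiguration directly.</customMessage>
</check>
```

A RegexChecker flags every source line matching the pattern, so intentional call sites (e.g. inside SparkHadoopUtil itself) would need a scalastyle:off/on suppression comment.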

@SparkQA

SparkQA commented Aug 29, 2015

Test build #41788 has finished for PR 8521 at commit 58dd847.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • public class JavaTrainValidationSplitExample

@JoshRosen
Contributor Author

Jenkins, retest this please.

@JoshRosen
Contributor Author

(Re-testing pre-emptively in case of flakiness)

@SparkQA

SparkQA commented Aug 30, 2015

Test build #41789 has finished for PR 8521 at commit d9cb6df.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 30, 2015

Test build #41792 has finished for PR 8521 at commit d9cb6df.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • public class JavaTrainValidationSplitExample
    • class KMeans @Since("1.5.0") (

```diff
@@ -858,7 +858,7 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
 // Use setInputPaths so that wholeTextFiles aligns with hadoopFile/textFile in taking
 // comma separated files as input. (see SPARK-7155)
 NewFileInputFormat.setInputPaths(job, path)
-val updateConf = job.getConfiguration
+val updateConf = SparkHadoopUtil.get.getConfigurationFromJobContext(job)
```
Member


@JoshRosen I didn't change job.getConfiguration last time because these call sites use org.apache.hadoop.mapreduce.Job, whose getConfiguration is compatible. However, +1 for this change since it is good for maintainability.
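The compatibility concern here is a binary one: between Hadoop 1.x and 2.x, JobContext changed from a class to an interface, so code compiled against one can fail at link time against the other; a centralized accessor that resolves the method reflectively sidesteps this. Below is a minimal sketch of that idea only; MiniJobContext and HadoopCompat are hypothetical stand-ins, not Spark or Hadoop classes.

```scala
import java.lang.reflect.Method

// Hypothetical stand-in for org.apache.hadoop.mapreduce.JobContext.
trait MiniJobContext {
  def getConfiguration: Map[String, String]
}

// Hypothetical stand-in for SparkHadoopUtil's accessor.
object HadoopCompat {
  // Resolve getConfiguration reflectively at runtime, so the call site
  // does not bake in whether the method lives on a class or an interface
  // (the Hadoop 1.x vs 2.x binary-incompatibility problem).
  def getConfigurationFromJobContext(context: AnyRef): Map[String, String] = {
    val method: Method = context.getClass.getMethod("getConfiguration")
    method.invoke(context).asInstanceOf[Map[String, String]]
  }
}

object Demo extends App {
  val job = new MiniJobContext {
    def getConfiguration: Map[String, String] = Map("spark.input.dir" -> "/data")
  }
  // Call sites go through the accessor instead of job.getConfiguration.
  println(HadoopCompat.getConfigurationFromJobContext(job)("spark.input.dir"))
}
```

With every call site routed through one accessor, a future change in how the configuration is obtained touches a single method instead of the whole codebase, which is the maintenance benefit noted above.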

@JoshRosen
Contributor Author

Jenkins, retest this please.

@SparkQA

SparkQA commented Sep 9, 2015

Test build #42179 has finished for PR 8521 at commit d9cb6df.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@JoshRosen
Contributor Author

Since this still passes tests, can we merge this now?

@JoshRosen
Contributor Author

Jenkins, retest this please.

@JoshRosen
Contributor Author

Going to retest and will merge once tests pass again.

@SparkQA

SparkQA commented Sep 11, 2015

Test build #42296 has finished for PR 8521 at commit d9cb6df.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@marmbrus
Contributor

LGTM

@JoshRosen
Contributor Author

Jenkins, retest this please.

@JoshRosen
Contributor Author

Going to merge this into master.

@asfgit closed this in b3a7480 on Sep 12, 2015
@JoshRosen deleted the SPARK-10330-part2 branch on September 12, 2015 at 23:26
@SparkQA

SparkQA commented Sep 13, 2015

Test build #42377 has finished for PR 8521 at commit d9cb6df.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

4 participants