
[SPARK-12615] Remove some deprecated APIs in RDD/SparkContext #10569

Closed
rxin wants to merge 3 commits

Conversation

rxin (Contributor) commented Jan 4, 2016

I looked at each case individually and it looks like they can all be removed. The only one I had to think twice about was toArray (I even thought about un-deprecating it, until I realized it was a problem in Java to have toArray return java.util.List).
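
For context, a minimal migration sketch in Scala (illustrative only, not part of this PR; the local-mode setup below is an assumption made for the example):

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical standalone example showing the replacement for the removed API.
val sc = new SparkContext(new SparkConf().setAppName("toArray-migration").setMaster("local[*]"))
val rdd = sc.parallelize(Seq(1, 2, 3))

// Before (deprecated): rdd.toArray() -- and in the Java API, JavaRDD.toArray()
// returned java.util.List, which is why un-deprecating it was unattractive.
// After: collect() is the replacement and returns Array[Int].
val result: Array[Int] = rdd.collect()

sc.stop()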

@@ -144,9 +131,6 @@ abstract class TaskContext extends Serializable {
    */
   def attemptNumber(): Int
 
-  @deprecated("use attemptNumber", "1.3.0")
-  def attemptId(): Long
Contributor:

I'm so happy to see this go... the old name was the subject of much confusion.
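
For reference, a hedged usage sketch of the surviving method (assumes an existing RDD named rdd and must run inside a task, where TaskContext.get() is non-null; illustrative only):

import org.apache.spark.TaskContext

// attemptNumber() replaces the confusingly named attemptId(): it is 0 for the
// first attempt of a task and increases by one for each retry.
rdd.foreachPartition { _ =>
  val attempt: Int = TaskContext.get().attemptNumber()
  // e.g. use `attempt` to make side effects idempotent across retries
}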

rxin changed the title from "[SPARK-12615] Remove some deprecated APIs in RDD/SparkContext." to "[SPARK-12615] Remove some deprecated APIs in RDD/SparkContext - WIP" on Jan 4, 2016
SparkQA commented Jan 4, 2016

Test build #48641 has finished for PR 10569 at commit c9f83d5.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
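
The MiMa failure reported above is the expected consequence of deleting public methods: binary-compatibility exclusions have to be registered for the removed signatures. A hedged sketch of what such entries look like in project/MimaExcludes.scala (the exact filter types and fully qualified names used by this PR may differ):

// Hypothetical MiMa exclusions, illustrative only.
import com.typesafe.tools.mima.core._

Seq(
  ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.TaskContext.attemptId"),
  ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.Aggregator.combineValuesByKey")
)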

srowen (Member) commented Jan 4, 2016

+1. On this note, I am about to submit a change that fixes about 75% of the remaining build warnings, most due to deprecated API usages. Not sure whether it's easier to merge that first since it will remove usages of these methods in the tests and examples that I think are otherwise going to fail.

SparkQA commented Jan 4, 2016

Test build #2313 has started for PR 10569 at commit 5d3b013.

SparkQA commented Jan 5, 2016

Test build #2314 has finished for PR 10569 at commit 5d3b013.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jan 5, 2016

Test build #2316 has finished for PR 10569 at commit 5d3b013.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -87,7 +87,7 @@ class HiveSparkSubmitSuite
     runSparkSubmit(args)
   }
 
-  test("SPARK-8489: MissingRequirementError during reflection") {
+  ignore("SPARK-8489: MissingRequirementError during reflection") {
Contributor (Author):

@andrewor14 this test case is failing due to binary compatibility.

Contributor:

Let's file a followup JIRA for this one.

SparkQA commented Jan 5, 2016

Test build #48733 has finished for PR 10569 at commit fb676c2.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jan 5, 2016

Test build #48735 has finished for PR 10569 at commit d91a8d3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

rxin changed the title from "[SPARK-12615] Remove some deprecated APIs in RDD/SparkContext - WIP" to "[SPARK-12615] Remove some deprecated APIs in RDD/SparkContext" on Jan 5, 2016
@@ -34,10 +34,6 @@ case class Aggregator[K, V, C] (
     mergeValue: (C, V) => C,
     mergeCombiners: (C, C) => C) {
 
-  @deprecated("use combineValuesByKey with TaskContext argument", "0.9.0")
-  def combineValuesByKey(iter: Iterator[_ <: Product2[K, V]]): Iterator[(K, C)] =
-    combineValuesByKey(iter, null)
Contributor:

Ha, the old code here doesn't even make sense: why not pass TaskContext.get() instead of null? Glad to see this gone.
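
For illustration, a hedged sketch of the pattern the reviewer is describing, i.e. passing the live TaskContext to the remaining overload (assumes it runs inside a task so TaskContext.get() is non-null; not part of this PR):

import org.apache.spark.{Aggregator, TaskContext}

// Hypothetical word-count style combine over one partition's records.
val agg = Aggregator[String, Int, Int](
  createCombiner = v => v,
  mergeValue = (c, v) => c + v,
  mergeCombiners = (c1, c2) => c1 + c2)

def combinePartition(iter: Iterator[(String, Int)]): Iterator[(String, Int)] =
  agg.combineValuesByKey(iter, TaskContext.get())  // instead of passing null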

JoshRosen (Contributor) commented:

LGTM. I'm going to merge this now in order to avoid merge conflicts. Let's file a followup to deal with #10569 (diff), though.

asfgit closed this in 8ce645d on Jan 5, 2016