
[SPARK-4397] Move object RDD to the front of RDD.scala. #3580

Closed
rxin wants to merge 4 commits

Conversation

@rxin (Contributor) commented Dec 3, 2014

I ran into multiple cases where the SBT/Scala compiler was confused by the implicits in continuous compilation mode. Adding explicit return types fixes the problem.

I ran into multiple cases where the SBT/Scala compiler was confused by the implicits in continuous compilation mode.
Moving them to the beginning of the file seemed to have fixed the problem.
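
For context, a sketch of the kind of change described above, modeled on one of the implicit conversions this PR touches (the enclosing object name is illustrative; the conversion mirrors rddToPairRDDFunctions as shown later in the diff):

  import scala.language.implicitConversions
  import scala.reflect.ClassTag

  import org.apache.spark.rdd.{PairRDDFunctions, RDD}

  object RDDImplicitsSketch {
    // Before: result type inferred from the body.
    // implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
    //     (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null) = {
    //   new PairRDDFunctions(rdd)
    // }

    // After: explicit result type, so the implicit stays applicable even when the
    // compiler reaches a use site before this definition.
    implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
        (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null)
      : PairRDDFunctions[K, V] = {
      new PairRDDFunctions(rdd)
    }
  }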
@rxin changed the title from "Move object RDD to the front of RDD.scala." to "[SPARK-4397] Move object RDD to the front of RDD.scala." on Dec 3, 2014
@SparkQA commented Dec 3, 2014

Test build #24088 has started for PR 3580 at commit a836a37.

  • This patch merges cleanly.

@SparkQA commented Dec 3, 2014

Test build #24088 has finished for PR 3580 at commit a836a37.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24088/
Test FAILed.

}

implicit def rddToOrderedRDDFunctions[K : Ordering : ClassTag, V: ClassTag](rdd: RDD[(K, V)])
: OrderedRDDFunctions[K, V] = {
Member: should be OrderedRDDFunctions[K, V, (K, V)]
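
For reference, a sketch of the conversion with the corrected three-parameter result type suggested above (the wrapper object is illustrative; the signature assumes the OrderedRDDFunctions[K, V, P <: Product2[K, V]] class as it exists in Spark):

  import scala.language.implicitConversions
  import scala.reflect.ClassTag

  import org.apache.spark.rdd.{OrderedRDDFunctions, RDD}

  object OrderedConversionSketch {
    // Explicit result type, including the third type parameter (K, V).
    implicit def rddToOrderedRDDFunctions[K : Ordering : ClassTag, V: ClassTag](rdd: RDD[(K, V)])
      : OrderedRDDFunctions[K, V, (K, V)] = {
      new OrderedRDDFunctions[K, V, (K, V)](rdd)
    }
  }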

@zsxwing (Member) commented Dec 3, 2014

Moving them to the beginning of the file seemed to have fixed the problem.

Not sure if this is the correct fix. Here is the error message on my machine:

Note: implicit method numericRDDToDoubleRDDFunctions is not applicable here because it comes after the application point and it lacks an explicit result type

After I added the return types to all implicit methods, I could not reproduce this error in continuous compilation mode.
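
A minimal, self-contained sketch of the Scala rule behind that note (not Spark code; Meters and doubleToMeters are made-up names): an implicit defined after the point of use is only considered if it declares an explicit result type.

  import scala.language.implicitConversions

  object ImplicitResultTypeDemo {
    final class Meters(val value: Double)

    def describe(m: Meters): String = s"${m.value} m"

    def main(args: Array[String]): Unit = {
      // Needs the Double => Meters conversion defined below. This compiles only
      // because doubleToMeters declares ": Meters"; drop the result type and
      // scalac reports the "comes after the application point" note instead.
      println(describe(3.0))
    }

    implicit def doubleToMeters(d: Double): Meters = new Meters(d)
  }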

@zsxwing (Member) commented Dec 3, 2014

Could you also add the return type for writableWritableConverter? It has a similar issue.

  implicit def writableWritableConverter[T <: Writable](): WritableConverter[T] =
    new WritableConverter[T](_.runtimeClass.asInstanceOf[Class[T]], _.asInstanceOf[T])

Conflicts:
	core/src/main/scala/org/apache/spark/SparkContext.scala
@rxin (Contributor, Author) commented Dec 3, 2014

Alright pushed a new version.

@SparkQA commented Dec 3, 2014

Test build #24106 has started for PR 3580 at commit b8562c9.

  • This patch merges cleanly.

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24104/
Test FAILed.

@rxin (Contributor, Author) commented Dec 3, 2014

Jenkins, retest this please.

@SparkQA commented Dec 3, 2014

Test build #24111 has started for PR 3580 at commit b8562c9.

  • This patch merges cleanly.

@SparkQA commented Dec 3, 2014

Test build #24106 has finished for PR 3580 at commit b8562c9.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24106/
Test PASSed.

@SparkQA commented Dec 3, 2014

Test build #24111 has finished for PR 3580 at commit b8562c9.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24111/
Test PASSed.

@zsxwing (Member) commented Dec 4, 2014

Could you test only adding the return types, without moving the code? If adding the return types works, I suggest not moving this code.

// compatibility and forward to the following functions directly.

implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
(implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null): PairRDDFunctions[K, V] = {
Contributor: you didn't preserve the indentation down there. Was this on purpose?

Contributor (Author): yeah, that was on purpose; it keeps the line within the 100-character limit.

@andrewor14 (Contributor)

LGTM, and +1 especially on adding return types to implicits.

@rxin (Contributor, Author) commented Dec 4, 2014

I moved it to the end. Looks like the problem is with implicit functions without explicit return types.

cc @pwendell @mateiz @andrewor14 @JoshRosen: in the future we should make sure all implicit functions have explicit return types. (All public functions should have explicit return types defined as well.)
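
A small sketch of that guideline (hypothetical names, not Spark code), covering both a plain public method and an implicit conversion:

  import scala.language.implicitConversions
  import scala.util.Try

  object ReturnTypeGuidelineSketch {
    // Public API: declare the result type rather than letting it be inferred,
    // so the signature cannot silently change when the body changes.
    def parseVersion(s: String): Option[Int] = Try(s.toInt).toOption

    final class RichString(val self: String) extends AnyVal {
      def words: Array[String] = self.split("\\s+")
    }

    // Implicit conversions additionally need the explicit result type to stay
    // applicable when used before their definition point, per the compiler note
    // quoted earlier in this thread.
    implicit def stringToRichString(s: String): RichString = new RichString(s)
  }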

@SparkQA commented Dec 4, 2014

Test build #24132 has started for PR 3580 at commit ee32fcd.

  • This patch merges cleanly.

@SparkQA commented Dec 4, 2014

Test build #24132 has finished for PR 3580 at commit ee32fcd.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24132/
Test PASSed.

@zsxwing (Member) commented Dec 4, 2014

LGTM

@mateiz (Contributor) commented Dec 4, 2014

Good catch on the return types. Would be great if we can make ScalaStyle complain about those.
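
As a rough sketch of what such a check could look for (this is not a Scalastyle rule, just a standalone heuristic over single-line declarations, with obvious false negatives for multi-line signatures and parameterless defs with context bounds):

  import scala.io.Source

  object ImplicitReturnTypeCheckSketch {
    // Flags single-line `implicit def`s where nothing between the last ')' (or
    // the method name, if there is no parameter list) and the '=' declares a
    // result type.
    def suspiciousLines(path: String): List[(Int, String)] = {
      val source = Source.fromFile(path)
      try {
        source.getLines().zipWithIndex.collect {
          case (line, idx)
              if line.trim.startsWith("implicit def") && {
                val tail = line.substring(line.lastIndexOf(')') + 1)
                tail.contains("=") && !tail.contains(":")
              } =>
            (idx + 1, line.trim)
        }.toList
      } finally source.close()
    }

    def main(args: Array[String]): Unit =
      args.toList.flatMap(suspiciousLines).foreach {
        case (n, line) => println(s"$n: $line")
      }
  }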

@rxin (Contributor, Author) commented Dec 5, 2014

I believe @adriaanm's new linter can check for that (it isn't supported in Scalastyle right now).

@rxin (Contributor, Author) commented Dec 5, 2014

Merging in master.

@asfgit closed this in ed92b47 on Dec 5, 2014
@adriaanm commented Dec 5, 2014

Yes, it's over at https://github.com/scala/scala-abide, by the way. (The fabled ImplicitMustNotHaveInferredType rule does not exist yet, though it would be easy to add.)

@rxin (Contributor, Author) commented Dec 5, 2014

Thanks - any ETA on the 1st release?

@adriaanm commented Dec 5, 2014

Not yet. You can already use it, though. We want to have a good number of rules before we start advertising it; right now we're in the phase of looking for partners to build out the platform and flesh out the rule set.

asfgit pushed a commit that referenced this pull request Dec 31, 2014
…Spark SQL

As we learned in #3580, not explicitly typing implicit functions can lead to compiler bugs and potentially unexpected runtime behavior.

Author: Reynold Xin <rxin@databricks.com>

Closes #3859 from rxin/sql-implicits and squashes the following commits:

30c2c24 [Reynold Xin] [SPARK-5038] Add explicit return type for implicit functions in Spark SQL.
asfgit pushed a commit that referenced this pull request Jan 1, 2015
As we learned in #3580, not explicitly typing implicit functions can lead to compiler bugs and potentially unexpected runtime behavior.

This is a follow-up PR for the rest of Spark (outside Spark SQL). The original PR for Spark SQL can be found at #3859.

Author: Reynold Xin <rxin@databricks.com>

Closes #3860 from rxin/implicit and squashes the following commits:

73702f9 [Reynold Xin] [SPARK-5038] Add explicit return type for implicit functions.