[SPARK-23667][CORE] Better scala version check #20809
Conversation
For the case, shouldn't we just set
@viirya Yes, but that only helps people who are investigating Spark code, and it still requires manual effort. Isn't it better if we get this automatically?
Can you provide more information in the bug report? e.g., a sample application and a sample error. I don't think this is the correct change, but without your use case I'm not sure what the right change would be.
@vanzin Thanks. : )
That sounds a little odd. If that is true, then your proposed code wouldn't work either, since it requires SPARK_HOME to be known. In any case, there are two calls. The first is:
And your code shouldn't be triggering that, since both env variables are for Spark development and other applications shouldn't be using them. The second call is a little later:
Here
Do you plan to update this PR? Otherwise it should be closed.
@vanzin Sorry, but I will update it next week, thanks.
@vanzin Sorry for the late reply. According to the call stack, it's the first place that called
I don't understand your reply. The testing stuff should only be true during Spark unit tests. You shouldn't be setting that in your tests because you're not testing Spark. If you are, you should fix your testing infrastructure to not do that.
Can one of the admins verify this patch? |
What changes were proposed in this pull request?

In some cases, when an outer project uses a pre-built Spark as a dependency, `getScalaVersion` will fail because the `launcher` directory doesn't exist. This PR additionally checks the `jars` directory.

How was this patch tested?

Existing tests.
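The fallback described above can be sketched as follows. This is only an illustration of the idea discussed in the PR, not Spark's actual `getScalaVersion` implementation: the directory layout (`launcher/target` for a source build, `jars/` for a pre-built distribution) and the jar-name parsing are assumptions made for this sketch.

```java
import java.io.File;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of inferring the Scala binary version from a Spark
// distribution. Not Spark's real code; names and paths are assumptions.
public class ScalaVersionCheck {

  // Matches jar names like "scala-library-2.11.8.jar" and captures "2.11".
  private static final Pattern SCALA_JAR =
      Pattern.compile("scala-library-(\\d+\\.\\d+)\\..*\\.jar");

  // Extract the Scala binary version from a jar file name, or null if the
  // name is not a scala-library jar.
  static String versionFromJarName(String jarName) {
    Matcher m = SCALA_JAR.matcher(jarName);
    return m.matches() ? m.group(1) : null;
  }

  // First look at launcher/target (present in a source build); when it is
  // missing, fall back to scanning jars/ (present in a pre-built
  // distribution), which is the fallback this PR proposes.
  static String getScalaVersion(String sparkHome) {
    File launcher = new File(sparkHome, "launcher" + File.separator + "target");
    if (launcher.isDirectory()) {
      // ... source-tree detection elided in this sketch ...
    }
    File[] files = new File(sparkHome, "jars").listFiles();
    if (files != null) {
      for (File f : files) {
        String v = versionFromJarName(f.getName());
        if (v != null) {
          return v;
        }
      }
    }
    throw new IllegalStateException(
        "Cannot find Scala version under " + sparkHome);
  }

  public static void main(String[] args) {
    System.out.println(versionFromJarName("scala-library-2.11.8.jar"));
    System.out.println(versionFromJarName("spark-core_2.11-2.3.0.jar"));
  }
}
```

Scanning jar names avoids opening any files, which is why a pre-built distribution (where only `jars/` exists) can still be handled without SPARK_HOME pointing at a source tree.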