[SPARK-6673] spark-shell.cmd can't start in Windows even when spark was built #5328
Conversation
…as built added equivalent script to load-spark-env.sh
Test build #29593 has started for PR 5328 at commit …
Test build #29593 has finished for PR 5328 at commit …
Test PASSed.
CC @vanzin; I tend to trust your judgment about this and it does seem like Windows should have the same script. Does this only affect …
This problem was introduced by e3eb393.
Ah, I see that #5085 broke this in master. The previous code detected the Scala version automatically, although in a different way from this patch. I tend to prefer the code that was there before (simpler), but I don't have strong feelings either way.
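For context, the kind of auto-detection being discussed can be sketched roughly as follows. This is a minimal illustrative sh sketch, not Spark's actual load-spark-env.sh code; the function name and directory layout are assumptions:

```shell
#!/bin/sh
# Hypothetical sketch: pick the Scala version by looking at which
# assembly build directories exist, failing if both are present.
detect_scala_version() {
  assembly_root="$1"
  dir_210="$assembly_root/scala-2.10"
  dir_211="$assembly_root/scala-2.11"
  if [ -d "$dir_210" ] && [ -d "$dir_211" ]; then
    echo "error: builds for both Scala versions present; set SPARK_SCALA_VERSION" >&2
    return 1
  elif [ -d "$dir_211" ]; then
    echo "2.11"
  else
    # Default when only a 2.10 build (or no build) is found.
    echo "2.10"
  fi
}

# Demo: fake a build layout with only a Scala 2.11 assembly dir.
root=$(mktemp -d)
mkdir -p "$root/scala-2.11"
detect_scala_version "$root"   # prints "2.11"
rm -rf "$root"
```

The Windows fix in this PR would need the equivalent directory checks in cmd syntax (`if exist ...`), which is why a load-spark-env.cmd counterpart to load-spark-env.sh was added.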
@tsudukim up to you on whether you want to merge this or a different version. I'll merge whatever you see fit tomorrow since this one's important to fix.
@srowen @andrewor14 @vanzin - if this was caused by #5085, then does it affect Spark 1.3? |
Agree, I think this is only for …
Okay - let's see if we can figure that out and hopefully update the affects and target versions on the JIRA to make clear this is unrelated to 1.3.
@srowen This version is OK to merge. |
added equivalent script to load-spark-env.sh