Expected behavior

Having already used jupyter pixiedust install to install Spark 2.0, I expected the local installer to suggest a default location for Spark 1.6 other than /Users/mbrobergus.ibm.com/pixiedust/bin/spark/spark-2.0.2-bin-hadoop2.7. The fix is to suggest /Users/mbrobergus.ibm.com/pixiedust/bin/spark and let the user go from there. The same issue occurs with the Scala version.
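A minimal sketch of the suggested default-path logic, assuming a hypothetical helper name (PixieDust's actual installer code may differ): propose the version-neutral parent directory rather than a previously installed, versioned subdirectory.

```python
import os

def default_spark_dir(pixiedust_home):
    # Hypothetical helper: suggest the version-neutral parent directory
    # (e.g. <pixiedust_home>/bin/spark) as the installer's default, instead
    # of an existing versioned subdirectory such as
    # <pixiedust_home>/bin/spark/spark-2.0.2-bin-hadoop2.7, so installing a
    # second Spark version (e.g. 1.6) starts from a sensible default.
    return os.path.join(pixiedust_home, "bin", "spark")

print(default_spark_dir("/Users/mbrobergus.ibm.com/pixiedust"))
```

The same approach would apply to the Scala install path: default to the parent directory and let the user pick or create the versioned subdirectory from there.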
Actual behavior
The local installer suggests /Users/mbrobergus.ibm.com/pixiedust/bin/spark/spark-2.0.2-bin-hadoop2.7 as the default directory even though Spark 2.0 is already installed.
Steps to reproduce the behavior
Re-run jupyter pixiedust install to install a different kernel.