Python kernel is not initializing all PySpark modules (e.g. PySpark ML) #56
@lresende -- let me take a look at this. When I created the launcher script I tried to keep the initialization code to a minimum. There may be more we want to do here.
Took a quick look at the SystemML code which is (re-)raising the `ImportError`: https://github.com/apache/systemml/blob/master/src/main/python/systemml/mlcontext.py#L31

```python
try:
    import py4j.java_gateway
    from py4j.java_gateway import JavaObject
    from pyspark import SparkContext
    import pyspark.mllib.common
except ImportError:
    raise ImportError('Unable to import `pyspark`. Hint: Make sure you are running with PySpark.')
```

@lresende -- is that the code you ran? The above imports work fine in my environment (without Anaconda).
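To narrow down which of these imports actually fails inside the kernel, a standalone check can be run before SystemML is loaded. This is a minimal sketch using only the standard library; the helper name `report_missing_modules` is hypothetical and not part of SystemML or the launcher:

```python
import importlib.util


def report_missing_modules(names):
    """Return the subset of module names whose top-level package cannot be
    resolved on sys.path.

    Hypothetical helper: useful for checking whether `py4j` and `pyspark`
    are visible to the Python kernel before SystemML tries to import them.
    """
    missing = []
    for name in names:
        # find_spec returns None when the top-level package is not importable;
        # we check only the top-level package to avoid importing parents.
        if importlib.util.find_spec(name.split(".")[0]) is None:
            missing.append(name)
    return missing


# Check the modules that SystemML's mlcontext.py imports
print(report_missing_modules(["py4j.java_gateway", "pyspark", "pyspark.mllib.common"]))
```

If this prints a non-empty list inside the kernel but an empty list in a plain PySpark shell, the kernel's `PYTHONPATH` is the likely culprit.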
I am running this, with the addition of an initial line to install SystemML. And I only have Anaconda on the master node, not on the nodes where IPython is installed.
@lresende -- can you still reproduce this issue?
Properly updated the Python and R kernelspecs with the required environment variables.
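The environment variables in question typically point the kernel at Spark's bundled Python libraries. A minimal sketch of how a kernelspec could compute them, assuming the standard Spark 2.x layout where py4j ships as a zip under `$SPARK_HOME/python/lib` (the helper name `spark_python_paths` is hypothetical):

```python
import glob
import os


def spark_python_paths(spark_home):
    """Return the PYTHONPATH entries a PySpark-enabled kernelspec needs.

    Hypothetical helper, assuming the usual Spark layout: the Python API
    lives under $SPARK_HOME/python, and py4j is bundled as a zip like
    python/lib/py4j-0.10.7-src.zip.
    """
    paths = [os.path.join(spark_home, "python")]
    # Glob so the py4j version number does not need to be hard-coded
    paths.extend(sorted(glob.glob(
        os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))))
    return paths
```

A `kernel.json` could then carry these in its `env` block, e.g. `"env": {"SPARK_HOME": "/usr/local/spark", "PYTHONPATH": "<joined paths>"}`, so every import in `mlcontext.py` resolves without relying on an Anaconda install on the worker nodes.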
Running the sample SystemML notebook fails with:

```
/opt/anaconda2/lib/python2.7/site-packages/systemml/mlcontext.py in <module>()
     30     import pyspark.mllib.common
     31 except ImportError:
---> 32     raise ImportError('Unable to import `pyspark`. Hint: Make sure you are running with PySpark.')
     33
```