What happened?
I am unable to create a connection using ibis.pyspark.connect with a remote Spark session created using DatabricksSession.

From what I can see, this is because the session.sparkContext attribute is accessed within pyspark.Backend.do_connect.

I have tested this by commenting out that call and can then successfully create a connection. I can't see any place where this context is specifically used, so I'm not sure whether it's strictly required.
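For context, here is a minimal reproduction sketch. It assumes a Databricks Connect (Spark Connect) session; the exact builder configuration depends on your workspace.

# Spark Connect session created via Databricks Connect; it has no JVM-backed SparkContext
from databricks.connect import DatabricksSession
import ibis

spark = DatabricksSession.builder.getOrCreate()
con = ibis.pyspark.connect(spark)  # raises PySparkAttributeError: [JVM_ATTRIBUTE_NOT_SUPPORTED]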
What version of ibis are you using?
9.0.0
What backend(s) are you using, if any?
PySpark
Relevant log output
File "/home/****/lib/python3.11/site-packages/IPython/core/interactiveshell.py", line 3550, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-2-6ab88d163e5c>", line 8, in<module>
con = ibis.pyspark.connect(spark)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/****/lib/python3.11/site-packages/ibis/__init__.py", line 108, in connect
return backend.connect(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/****/lib/python3.11/site-packages/ibis/backends/__init__.py", line 848, in connect
new_backend.reconnect()
File "/home/****/lib/python3.11/site-packages/ibis/backends/__init__.py", line 862, in reconnect
self.do_connect(*self._con_args, **self._con_kwargs)
File "/home/****/lib/python3.11/site-packages/ibis/backends/pyspark/__init__.py", line 197, in do_connect
self._context = session.sparkContext
^^^^^^^^^^^^^^^^^^^^
File "/home/****/lib/python3.11/site-packages/pyspark/sql/connect/session.py", line 772, in __getattr__
raise PySparkAttributeError(
pyspark.errors.exceptions.base.PySparkAttributeError: [JVM_ATTRIBUTE_NOT_SUPPORTED] Attribute `sparkContext` is not supported in Spark Connect as it depends on the JVM. If you need to use this attribute, do not use Spark Connect when creating your session. Visit https://spark.apache.org/docs/latest/sql-getting-started.html#starting-point-sparksession for creating regular Spark Session in detail.
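For reference, the workaround I tested is essentially to skip the sparkContext lookup when it isn't available. A sketch of what such a guard could look like (the helper name is hypothetical, and this is not necessarily how ibis would fix it):

from pyspark.errors import PySparkAttributeError

def _maybe_spark_context(session):
    # Spark Connect sessions raise PySparkAttributeError for JVM-backed attributes,
    # so fall back to None instead of failing inside do_connect.
    try:
        return session.sparkContext
    except PySparkAttributeError:
        return None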
Code of Conduct
I agree to follow this project's Code of Conduct