Remove the hard dependency on the pyspark #6

Right now, the chispa package has a hard dependency on the pyspark package, making it hard to use with the Databricks runtime or other compatible Spark runtimes. Instead, this package should either rely entirely on an implicit dependency, or use something like the findspark package, as is done in spark-testing-base or in pytest-spark.
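A minimal sketch of the findspark approach mentioned above: findspark locates a Spark installation that is already present (for example, a Databricks runtime) instead of requiring pyspark as a pip dependency. The session settings and names below are illustrative assumptions, not part of chispa or the linked projects:

```python
# Illustrative test bootstrap: find the Spark already installed on the
# machine or cluster rather than depending on the pyspark pip package.
import findspark

# Must run before importing pyspark; uses SPARK_HOME (or common install
# locations) to put Spark's Python bindings on sys.path.
findspark.init()

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[1]")
    .appName("chispa-tests")
    .getOrCreate()
)
```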
@alexott - thanks for reporting this & sorry for the delayed response. I wasn't able to replicate your issue (I was able to successfully attach chispa v0.6.0 to a Databricks cluster via PyPI and by manually attaching the wheel), but I'm a Python n00b and I'm sure your point is valid. In the Scala world, we add the Spark dependency as a "provided" dependency. I updated PySpark to be a "dev-dependency" rather than a regular dependency:
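The dependency change itself would live in pyproject.toml. A minimal sketch, assuming a Poetry-managed project; chispa's actual file and version constraints may differ:

```toml
# pyproject.toml (sketch; Poetry layout and version constraints are assumptions)
[tool.poetry.dependencies]
python = "^3.7"
# no pyspark entry here, so installing chispa does not pin any Spark version

[tool.poetry.dev-dependencies]
# dev-only: needed to run chispa's test suite, not pulled in by installers
pyspark = "^3.1"
```

With this layout, `pip install chispa` leaves the runtime's own Spark untouched, while contributors still get PySpark via `poetry install`.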
I published chispa v0.7.0. Can you please try it out and let me know if it fixes your issue? Thanks!

Thank you very much Matthew! I'll check on Monday, when I get to my work laptop...

Thank you! I just checked, it works just fine now.

@alexott - thanks for confirming! If you ever have any additional recommendations for this library, just let me know, thanks!