Getting the error that `spark.sql.mapKeyDedupPolicy` is not supported by Databricks SQL Warehouses when using the ibis pyspark backend with a Databricks SQL Warehouse cluster.
Thanks for opening this @ArtnerC! I think we'd happily accept a PR with the fix you suggest if you're interested in submitting one. I think we can skip checking for a specific exception and just ignore any exceptions on that line:
```python
try:
    spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")
except Exception:  # is there a specific exception class we could catch instead of just `Exception`?
    pass
```
Re: general Databricks support, we don't have any immediate plans to set up a Databricks testing environment (or a Databricks-specific backend, if one is needed), but if it's possible to make things work with just our existing pyspark backend, we'd happily continue to accept bugfixes toward making that work.
Getting the error that `spark.sql.mapKeyDedupPolicy` is not supported by Databricks SQL Warehouses when using ibis pyspark with a Databricks SQL Warehouse cluster. See: https://community.databricks.com/t5/data-engineering/spark-settings-in-sql-warehouse/td-p/7959
The config is set in `do_connect`:
https://github.com/ibis-project/ibis/blame/e425ad57899f8ebbea29b57bb53cedb40ebd7193/ibis/backends/pyspark/__init__.py#L180
A workaround could be as simple as wrapping that `spark.conf.set` call in a try/except, but I'm not sure what other approaches there might be.
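The try/except workaround can be sketched as a small helper that attempts the conf set and swallows the failure on restricted clusters. This is a minimal sketch, not the actual ibis patch; the function name `set_conf_best_effort` and the stub conf classes are hypothetical, used here only to illustrate the behavior.

```python
def set_conf_best_effort(conf, key, value):
    """Try to set a Spark config entry; ignore failures on clusters
    that reject it (e.g. Databricks SQL Warehouses rejecting
    spark.sql.mapKeyDedupPolicy)."""
    try:
        conf.set(key, value)
    except Exception:
        # Py4J-raised errors don't share a narrow common base class,
        # so catching Exception broadly is the pragmatic choice here.
        pass


# Hypothetical stub mimicking a SQL Warehouse that rejects the setting:
class RejectingConf:
    def set(self, key, value):
        raise Exception(f"Configuration {key} is not available.")


# Does not raise, matching the proposed try/except behavior:
set_conf_best_effort(RejectingConf(), "spark.sql.mapKeyDedupPolicy", "LAST_WIN")
```

On a regular Spark cluster the same call succeeds and the setting takes effect; the helper only changes behavior where the cluster forbids the key.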