unnest() fails on Databricks #3267
Comments
Hi, I wonder if, after closing a connection and reconnecting, the ability to create Temp Views is restricted. Can you try creating a Temp View manually?
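For reference, a minimal sketch of creating a Temp View by hand through sparklyr; this is an illustration, not the snippet from the original comment (the view name `test_view`, the `mtcars` data, and the `databricks` connection method are all assumptions):

```r
library(sparklyr)
library(dplyr)

# Assumes this is running on a Databricks cluster
sc <- spark_connect(method = "databricks")

# Copy a small local data frame to Spark
sdf <- copy_to(sc, mtcars, overwrite = TRUE)

# Register it as a temp view directly via the underlying Spark DataFrame,
# bypassing sdf_register()
invoke(spark_dataframe(sdf), "createOrReplaceTempView", "test_view")

# Confirm the view is visible to the current session
DBI::dbGetQuery(sc, "SELECT COUNT(*) AS n FROM test_view")
```

If this manual call also fails after a reconnect, that would point at the session/view lifecycle rather than at sparklyr itself.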
This does work fine. I was doing some debugging, and it seems the failure occurs in the call to sdf_register, where the invoke of createOrReplaceTempView (invoked from sdf_fast_bind_cols) isn't actually creating the view. Why that is, however, I cannot figure out yet.
Hi @tschutte, is this still giving you problems on later versions of Databricks clusters?
Yes, this is still occurring with sparklyr 1.8.1 and DBR 12.2.
I have exactly the same issue with DBR 14.3 and sparklyr 1.8.1; it would be really helpful if this worked!
I have an interesting issue running sparklyr::unnest() inside of Databricks. The first time I use the function after a cluster restart, it works perfectly. However, if I restart my R session when using RStudio, or detach/reattach a notebook, the function fails, claiming it cannot locate a tmp table.
This is reproducible with the default sparklyr 1.7.5 provided by Databricks and also after upgrading to 1.7.7. I am using Databricks Runtime 10.4.
I assume the error comes from the creation of the temporary table near the bottom of that function, but it is not apparent to me what would cause it, especially only after an R session restart.
Here is an easily reproducible example using a simple test case.
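The original snippet did not survive the page extraction; a minimal reproduction along the lines described (the column names, sample data, and `databricks` connection method are illustrative assumptions) might look like:

```r
library(sparklyr)
library(dplyr)
library(tidyr)

# Assumes a Databricks cluster runtime
sc <- spark_connect(method = "databricks")

sdf <- copy_to(
  sc,
  tibble::tibble(id = c(1, 1, 2), value = c("a", "b", "c")),
  overwrite = TRUE
)

# Nest on the Spark side, then unnest again; per the report, unnest()
# works on the first run after a cluster restart but fails with a
# "cannot locate tmp table"-style error after the R session is
# restarted and the connection is re-established
nested <- sdf %>% nest(data = value)
nested %>% unnest(data)
```

Under the reported behavior, re-running only the final `unnest(data)` line after restarting the R session and reconnecting should reproduce the failure.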
Let me know if I can provide any other info to help troubleshoot.
Error
Session Info