Support `TIMESTAMP AS OF`, `VERSION AS OF` in SQL #128
Hi @spmp, the functionality is currently available. We don't have custom SQL API support because that depends on changes in Spark, but there are Scala APIs for all of those.

Documentation for the time travel options is available at:
Documentation for the Scala APIs for `DESCRIBE HISTORY` is available at:

Thanks, and please let us know if you have further questions.
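For readers landing here, the Scala APIs referred to above look roughly like the sketch below. The table path is a hypothetical placeholder, and an active `SparkSession` named `spark` is assumed:

```scala
import io.delta.tables.DeltaTable

// Hypothetical Delta table path, used only for illustration.
val path = "/tmp/delta/events"

// Time travel via DataFrameReader options:
val asOfVersion = spark.read.format("delta")
  .option("versionAsOf", 0)
  .load(path)

val asOfTimestamp = spark.read.format("delta")
  .option("timestampAsOf", "2019-01-01")
  .load(path)

// Equivalent of DESCRIBE HISTORY via the Scala API:
DeltaTable.forPath(spark, path).history().show()
```

These are DataFrame/`DeltaTable` calls rather than SQL syntax, which is exactly the gap this issue tracks.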
(Sorry for the late response, I received no email notification...)
You're right, there's definitely some interaction with Spark needed. It's hard to give a good estimate for this because it depends on the Spark 3.0 release (which can't really be scheduled, since it's a community effort with lots of different voices), but we're hoping for sometime this fall. I'll reopen this issue (and add SQL to the title) so we can use it to track SQL APIs for these features.
Hi guys, I'm reading this and it seems that SQL support was integrated in Delta. I tried this simple code snippet and it seems to work:

```scala
def showDeltaTableHistoryViaSQL() = {
  spark.sql(s"DESCRIBE HISTORY '${mydata.tbAccountsPath}'").show(false)
}
```

Did I understand well? Can this issue be closed?
Yes, we did add support for simple SQL commands like `DESCRIBE HISTORY`. Since we have already added that, I have updated the title accordingly. Incidentally, support for the MERGE SQL command will come with the 0.7.0 release (the next release is 0.6.0, in March), after Spark 3.0 is released.
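For reference, the MERGE SQL command mentioned here follows Delta's documented `MERGE INTO` syntax. The table and column names below are hypothetical, and this assumes Delta Lake 0.7.0+ on Spark 3.0 with an active `SparkSession` named `spark`:

```scala
// Hypothetical tables `accounts` and `updates`, for illustration only.
spark.sql("""
  MERGE INTO accounts AS target
  USING updates AS source
  ON target.accountId = source.accountId
  WHEN MATCHED THEN
    UPDATE SET target.balance = source.balance
  WHEN NOT MATCHED THEN
    INSERT (accountId, balance) VALUES (source.accountId, source.balance)
""")
```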
Nice! Thank you very much for your support and your answer!
Hi @tdas, when will this feature be available? I'm looking for it. Thanks.
Thanks @brucemen711 for bumping this. I have looked again in 2021 at https://docs.delta.io/latest/delta-utility.html#history and, as far as I know, there is still no SQL interface to these commands.
I agree that this feature depends on changes in Spark. It involves adding keywords, overriding some visit functions in the AstBuilder, and adding new unresolved LogicalPlans for time travel.
@YannByron Feel free to raise a Spark issue to ask for the time travel SQL syntax support. |
I am looking forward to it. @YannByron, please keep us in the loop for testing 8) Thanks
Since this currently doesn't work in SQL, can we remove the SQL examples from the public docs for now? https://docs.delta.io/1.0.0/delta-batch.html#syntax |
@AFFogarty Good call. We will review the doc and update the examples. |
Kia ora all. Should we call this out and close this issue? The use case I really have in mind is a non-technical user accessing a Delta table via Hive, so I am looking for SQL interfaces to these commands so that non-technical users or downstream processes can access them.
+1 on this. It's very confusing to look at these docs and then wonder why it doesn't work in practice. :) |
Great call out @nchammas - this should be fixed for the next documentation release. Will keep this open until this is resolved. Thanks! |
I hit the same issue today. Could you update the official documentation? https://docs.delta.io/latest/delta-batch.html#query-an-older-snapshot-of-a-table-time-travel
We are working on the fix for the doc. It will be out soon. |
Closing this as support was added in #1288 and is being released in 2.1 |
Please add support for the time travel functions `TIMESTAMP AS OF`, `VERSION AS OF`, and `DESCRIBE HISTORY` in SQL.

Time travel is a critical Delta use case.

Cheers.
--
Updated by @zsxwing : Spark has added the SQL syntax support in https://issues.apache.org/jira/browse/SPARK-37219 . This will be supported when Delta supports Spark 3.3 (#1217).
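For context, SPARK-37219 adds time-travel clauses to Spark's SQL parser. Once Delta supports Spark 3.3, queries along these lines should work; the table name below is hypothetical and an active `SparkSession` named `spark` is assumed:

```scala
// Hypothetical table `events`; requires Spark 3.3+ with Delta Lake.
spark.sql("SELECT * FROM events VERSION AS OF 1").show()
spark.sql("SELECT * FROM events TIMESTAMP AS OF '2021-11-01 00:00:00'").show()
```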