
Can we expose a REST API on Delta tables? #22

Closed
pawankumarshukla opened this issue May 1, 2019 · 5 comments

Labels
question Questions on how to use Delta Lake

Comments

pawankumarshukla commented May 1, 2019

To provide data downstream, we are using a REST API on top of a NoSQL database. Since Delta tables support ACID transactions, there is a proposal to replace the NoSQL database with a Delta table. Can we point the existing REST API at a Delta table?

tdas (Contributor) commented May 1, 2019

We don't have any support for that right now. The only APIs we have right now are the Spark DataFrame reader/writer APIs, and we will have SQL support that will be executable through Spark SQL.

It's important to understand that Delta Lake is a data layout format. It does not run a service or a process, so the question of any API endpoint does not arise.
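
For illustration (not part of the original comment), here is a minimal sketch of the Spark DataFrame reader/writer APIs mentioned above, assuming a Spark build with the delta-core artifact on the classpath; the table path is a placeholder:

```scala
// A minimal sketch, assuming Spark with the delta-core artifact on
// the classpath. The /tmp path is an illustrative placeholder.
import org.apache.spark.sql.SparkSession

object DeltaReadWriteSketch extends App {
  val spark = SparkSession.builder()
    .appName("delta-read-write")
    .master("local[*]")
    .getOrCreate()

  // Write: persist a small DataFrame in the Delta format.
  spark.range(0, 10).toDF("id")
    .write.format("delta").mode("overwrite").save("/tmp/delta/events")

  // Read: load the same table back as a DataFrame.
  val events = spark.read.format("delta").load("/tmp/delta/events")
  events.show()

  spark.stop()
}
```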

@tdas tdas added the question Questions on how to use Delta Lake label May 1, 2019
tdas (Contributor) commented May 11, 2019

I am closing this issue. Please reopen it if you have any follow-up questions.

@tdas tdas closed this as completed May 11, 2019
LantaoJin added a commit to LantaoJin/delta that referenced this issue Mar 24, 2020
ReubenTheDS commented Dec 3, 2020

@tdas A year later, but I have a question in the same "zone" as the OP. I understand "that Delta Lake is a data layout format" (quoted above). Is there now a standard way of exposing large Delta Lake tables (we intend to use HDFS for storage)? The reason: we'd like to run visualisations on Delta Lake tables created by Spark scripts, rather than:
querying HDFS -> porting a Delta Lake table snippet to Spark Parquet -> writing temp Parquet data -> visualising with a 3rd-party platform -> deleting the temp Parquet data

Or is there another way you suggest we go about this? Thanks!

LantaoJin added a commit to LantaoJin/delta that referenced this issue Mar 12, 2021
rjurney commented Sep 28, 2021

This would be really helpful and really powerful. We could store schemas in one place (Delta tables) rather than in more than one place. We just need to be able to fetch the schema from a Delta table to do this :(
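
As a hedged aside (not from the thread): one way to fetch a Delta table's schema is simply through the Spark reader, since loading a table is lazy and exposes the schema without scanning any data. The path below is an illustrative placeholder:

```scala
// A minimal sketch of fetching a Delta table's schema via the Spark
// reader. Assumes a SparkSession with Delta support; the path is an
// illustrative placeholder.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.StructType

object DeltaSchemaSketch extends App {
  val spark = SparkSession.builder()
    .appName("delta-schema")
    .master("local[*]")
    .getOrCreate()

  // Loading is lazy: .schema comes from the table's metadata, and no
  // table data is scanned until an action runs.
  val schema: StructType =
    spark.read.format("delta").load("/tmp/delta/events").schema
  println(schema.treeString)

  spark.stop()
}
```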

zsxwing (Member) commented Sep 28, 2021

@rjurney Delta Lake is just a table format, similar to Parquet. It doesn't have any service. If you would like to read Delta tables through REST APIs, you can try Delta Sharing.
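
To illustrate that suggestion (a sketch under stated assumptions, not an official snippet from this thread): the Delta Sharing Spark connector reads shared tables over the sharing protocol's REST endpoints. The profile file and share/schema/table coordinates below are placeholders:

```scala
// A hedged sketch of reading a shared table through the Delta Sharing
// Spark connector. Assumes the delta-sharing-spark artifact is on the
// classpath; the profile file and share/schema/table names are
// illustrative placeholders.
import org.apache.spark.sql.SparkSession

object DeltaSharingSketch extends App {
  val spark = SparkSession.builder()
    .appName("delta-sharing-read")
    .master("local[*]")
    .getOrCreate()

  // Table path format: <profile-file-path>#<share>.<schema>.<table>
  val tablePath = "/path/to/profile.share#myshare.myschema.mytable"
  val shared = spark.read.format("deltaSharing").load(tablePath)
  shared.show()

  spark.stop()
}
```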

tdas pushed a commit to tdas/delta that referenced this issue May 31, 2023: "Add build support for Scala 2.11" (closes delta-io#18)