[destinations] Databricks #762
Comments
@AstrakhantsevaAA let's sync on Slack as well. We should invite the implementer as a contributor so it is easy to run our CI.
@AstrakhantsevaAA, this looks great. Under "Support data sources", does this mean that S3 and ADLS would only be supported for the staging part, or would this also be available for the destination dataset as well? We would likely want the option to have the destination dataset created as an "external table". Also, Auto Loader for Databricks SQL is currently in public preview (see here). I typically use COPY INTO, but having the option for Auto Loader would be nice to have.
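For reference, COPY INTO loads files from a staged location into a Databricks table. A minimal sketch of the kind of statement such a destination could issue; the table name, stage URI, and file format below are illustrative values, not from this issue:

```python
def build_copy_into(table: str, stage_uri: str, file_format: str = "PARQUET") -> str:
    """Render a Databricks SQL COPY INTO statement for a staged file location.

    `table` and `stage_uri` are caller-supplied; the statement shape follows
    the documented Databricks SQL COPY INTO syntax.
    """
    return (
        f"COPY INTO {table}\n"
        f"FROM '{stage_uri}'\n"
        f"FILEFORMAT = {file_format}"
    )

# Example: load Parquet files staged on S3 into a table.
statement = build_copy_into("analytics.events", "s3://my-bucket/staging/events/")
print(statement)
```

An Auto Loader-based path would instead stream new files from the stage as they arrive, which is why having both options could be useful.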
@phillem15 yes, I meant staging data sources. Users can use the filesystem destination to load files directly to S3, Azure, or GCS storage, or use the filesystem verified source to load files from buckets into the Databricks workspace.
@phillem15 are people using the Databricks file system like they use S3 for Athena? Actually, we could support it in our
Is this being worked on? I'm happy to contribute if required.
@phillem15 is working on it; I'll check the development status, and if it turns out that we need help, I'll get back to you. Thank you for your interest!
Feature description
Implement the Databricks destination. It will be quite similar to the Snowflake implementation.

Authorization:
Support data sources (staging storage):
Tests
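On the authorization point, a credentials layout modeled on how dlt's existing destinations (such as Snowflake) are configured in secrets.toml might look as follows; the section name and every field name here are assumptions, since the Databricks destination described in this issue was not yet implemented:

```toml
# Hypothetical secrets.toml fragment, modeled on dlt's Snowflake destination.
# Field names follow the Databricks SQL connector's connection parameters;
# all values below are placeholders.
[destination.databricks.credentials]
server_hostname = "adb-1234567890123456.7.azuredatabricks.net"
http_path = "/sql/1.0/warehouses/abcdef1234567890"
access_token = "dapi..."
```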