🎉 🐛 Source Airtable: cast native Airtable Types to JSONSchema Types #21962
Conversation
/test connector=connectors/source-airtable
Build Passed.
Looks legit to me - just a few questions in the review. TCS might want to reach out to users to let them know of the breaking change, in case they have pipelines consuming the data in string format - have you all aligned on any follow-up necessary?
Review threads (resolved):
airbyte-integrations/connectors/source-airtable/source_airtable/schema_helpers.py
airbyte-integrations/connectors/source-airtable/integration_tests/expected_records.jsonl
airbyte-integrations/connectors/source-airtable/acceptance-test-config.yml
We could safely push this one into Master because:
Once the Cloud part is ready, we just unpin the version for source-airtable and ship OAuth + dynamic schema discovery + the cast to JsonSchema types at the same time, so it looks like a single major update to users and minimizes the customers' effort.
Thank you for the thorough overview! This sounds like a good idea to bunch a lot of changes together
IMPORTANT: we would need to reach out to the existing Customers and announce that a big update is coming and they should be ready to Reset the data and Refresh the Schema.
👍🏻 please make sure TCS is aware of that on the original issue
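If the "Reset the data and Refresh the Schema" steps ever need to be scripted for affected workspaces rather than clicked through the UI, a minimal sketch against Airbyte's OSS Config API could look like the following. The endpoint paths, payload fields, and local URL are assumptions based on the OSS deployment and should be verified against your instance:

```python
import json
import urllib.request

AIRBYTE_API = "http://localhost:8000/api/v1"  # assumption: local OSS deployment


def refresh_schema_payload(source_id: str) -> dict:
    # disable_cache asks the server for a fresh schema discovery
    # instead of returning a previously cached catalog.
    return {"sourceId": source_id, "disable_cache": True}


def reset_payload(connection_id: str) -> dict:
    return {"connectionId": connection_id}


def post(path: str, payload: dict) -> bytes:
    req = urllib.request.Request(
        AIRBYTE_API + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# Usage (network calls, requires a running Airbyte instance):
# post("/sources/discover_schema", refresh_schema_payload("<source-id>"))
# post("/connections/reset", reset_payload("<connection-id>"))
```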
/publish connector=connectors/source-airtable
If you have connectors that successfully published but failed definition generation, follow step 4 here.
@erica-airbyte The server-side changes for Airtable OAuth are deployed to the Cloud according to the latest release and Cloud version. Are we good to go?
@bazarnov Yes, we need to reach out to customers if this change will cause their syncs to start failing. Will this change have any downstream effect on their destination data (i.e. stream name changing, field name changing, etc.)?
@erica-airbyte The change touches only the data types in stream schemas; the column names remain the same for customers who have already updated to 1.0.1 or later. If updating from 0.1.3, this is completely breaking and will require setting up the connector again.
Ok, thanks @bazarnov, we will send this outreach to customers tomorrow. Can we plan to have this update go out on Thursday?
Comms have been sent. @davydov-d please deploy this tomorrow
@erica-airbyte done
@davydov-d I had a user that tried to refresh the source schema after this was deployed and encountered the following errors:
@bazarnov as you were the one to work on this PR, are you aware of what could go wrong on the screenshots above?
cc @bazarnov can you take a look at this real quick? Customers are not able to get around this.
@pedroslopez as Airbyte OC, are you able to take a look here? It looks like the latest update that went out this AM might have broken customers. Is this an easy fix, or should we roll this back?
On a test workspace I changed the Auth Method to OAuth and gave access to all workspaces. Is there a chance that this update is not compatible with Users that are using API keys? |
@erica-airbyte @pedroslopez I've just tested using a PAT. @TBernstein4 did the customer follow these instructions to create a new one? These standard scopes are required for successful authentication:
data.records
data.recordComments
schema.bases
Also, could you please share the workspace link with this customer so we can examine the input config?
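For reference, a minimal sketch of building an Airtable OAuth authorization URL that requests the scopes listed above. This is an illustrative helper, not the connector's code; Airtable's published scope identifiers carry access suffixes (e.g. `data.records:read`), and its OAuth flow requires PKCE:

```python
from urllib.parse import urlencode

# Scope families mentioned above, with the :read suffix Airtable uses.
SCOPES = ["data.records:read", "data.recordComments:read", "schema.bases:read"]


def authorize_url(client_id: str, redirect_uri: str, state: str, code_challenge: str) -> str:
    # Build the authorization URL; client_id/redirect_uri are app-specific,
    # and code_challenge is the PKCE S256 challenge.
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": " ".join(SCOPES),
        "state": state,
        "code_challenge": code_challenge,
        "code_challenge_method": "S256",
    }
    return "https://airtable.com/oauth2/v1/authorize?" + urlencode(params)
```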
I will ask them about the PAT. I will mention you in the oncall issue that has the affected workspace links.
What
Resolving: https://github.com/airbytehq/alpha-beta-issues/issues/299
How
Added schema_helpers.py to provide the functionality to cast data types automatically.

🚨 User Impact 🚨
@erica-airbyte
This is a BREAKING CHANGE.
What should customers do?
After the update, customers should Refresh Schema and reset the data to pick up the correct schema types and avoid issues on the destination side.
The reason for the breaking change is this one: https://github.com/airbytehq/alpha-beta-issues/issues/299
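For intuition about the casting described above, here is a hypothetical sketch of the kind of mapping involved. The type names and coverage are illustrative only; the real schema_helpers.py may use different keys and handle more cases:

```python
# Hypothetical mapping from Airtable field types to JSON Schema types.
# Before this change, every field was emitted as a plain string.
SIMPLE_TYPES = {
    "checkbox": {"type": ["null", "boolean"]},
    "number": {"type": ["null", "number"]},
    "percent": {"type": ["null", "number"]},
    "dateTime": {"type": ["null", "string"], "format": "date-time"},
    "multipleSelects": {"type": ["null", "array"], "items": {"type": ["null", "string"]}},
}


def to_json_schema(airtable_type: str) -> dict:
    # Unknown types fall back to string, matching the old behaviour.
    return SIMPLE_TYPES.get(airtable_type, {"type": ["null", "string"]})
```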
Pre-merge Checklist
Expand the relevant checklist and delete the others.
Updating a connector
Community member or Airbyter
- Unit & integration tests pass: ./gradlew :airbyte-integrations:connectors:<name>:integrationTest
- README.md updated
- bootstrap.md updated. See description and examples
- docs/integrations/<source or destination>/<name>.md updated, including changelog. See changelog example
Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- /test connector=connectors/<name> command is passing
- /publish command described here