respond to post-merge code review for destination docs (#4418)
sherifnada committed Jun 29, 2021
1 parent 9212638 commit c2b3c23
Showing 4 changed files with 13 additions and 7 deletions.
@@ -40,7 +40,8 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
We use `JUnit` for Java tests.

### Unit and Integration Tests
-Place unit tests under `src/test/io/airbyte/integrations/sources/{{snakeCase name}}`.
+Place unit tests under `src/test/...`
+Place integration tests in `src/test-integration/...`

#### Acceptance Tests
Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in
@@ -40,7 +40,8 @@ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integrat
We use `JUnit` for Java tests.

### Unit and Integration Tests
-Place unit tests under `src/test/io/airbyte/integrations/sources/scaffold_java_jdbc`.
+Place unit tests under `src/test/...`
+Place integration tests in `src/test-integration/...`

#### Acceptance Tests
Airbyte has a standard test suite that all source connectors must pass. Implement the `TODO`s in
@@ -65,7 +65,7 @@ We recommend the following ways of iterating on your connector as you're making
* Test-driven development (TDD) using Airbyte's Acceptance Tests
* Directly running the docker image

-**Test-driven development in Java**
+#### Test-driven development in Java
This should feel like a standard flow for a Java developer: you make some code changes, then run Java tests against them. You can do this directly in your IDE, but you can also run all unit tests via Gradle by running the command to build the connector:

```
@@ -74,15 +74,15 @@ This should feel like a standard flow for a Java developer: you make some code c

This will build the code and run any unit tests. This approach is great when you are testing local behaviors and writing unit tests.

-**TDD using acceptance tests & integration tests**
+#### TDD using acceptance tests & integration tests

Airbyte provides a standard test suite (dubbed "Acceptance Tests") that runs against every destination connector. They are "free" baseline tests to ensure the basic functionality of the destination. When developing a connector, you can simply run the tests between each change and use the feedback to guide your development.

If you want to try out this approach, check out Step 6, which describes what you need to do to set up the Acceptance Tests for your destination.

The nice thing about this approach is that you are running your destination exactly as Airbyte will run it in the CI. The downside is that the tests do not run very quickly. As such, we recommend this iteration approach only once you've implemented most of your connector and are in the finishing stages of implementation. Note that Acceptance Tests are required for every connector supported by Airbyte, so you should run them a couple of times while iterating to make sure your connector is compatible with Airbyte.
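
If you want a quick way to kick off that suite locally, the usual route is the connector's Gradle integration-test task. The module path and task name below are assumptions based on the standard `airbyte-integrations` layout, so check your generated build files:

```bash
# Illustrative only -- assumes the standard airbyte-integrations Gradle layout;
# substitute your connector's directory name for <name>.
./gradlew :airbyte-integrations:connectors:destination-<name>:integrationTest
```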

-**Directly running the destination using Docker**
+#### Directly running the destination using Docker

If you want to run your destination exactly as it will be run by Airbyte \(i.e. within a docker container\), you can use the following commands from the connector module directory \(`airbyte-integrations/connectors/destination-<name>`\):

@@ -141,7 +141,10 @@ docker run -v $(pwd)/secrets:/secrets --rm airbyte/destination-<name>:dev check
```

### Step 5: Implement `write`
-The `write` operation is the main workhorse of a destination connector: it reads input data from the source and writes it to the underlying destination. It takes as input the config file used to run the connector as well as the configured catalog: the file used to describe the schema of the incoming data and how it should be written to the destination.
+The `write` operation is the main workhorse of a destination connector: it reads input data from the source and writes it to the underlying destination. It takes as input the config file used to run the connector as well as the configured catalog: the file used to describe the schema of the incoming data and how it should be written to the destination. Its "output" is two things:
+
+1. Data written to the underlying destination
+2. `AirbyteMessage`s of type `AirbyteStateMessage`, written to stdout to indicate which records have been written so far during a sync. It's important to output these messages when possible in order to avoid re-extracting messages from the source. See the [write operation protocol reference](https://docs.airbyte.io/understanding-airbyte/airbyte-specification#write) for more information.

To implement the `write` Airbyte operation, implement the `getConsumer` method in your generated `<Name>Destination.java` file. Here are some example implementations from different destination connectors:

@@ -150,6 +153,7 @@ To implement the `write` Airbyte operation, implement the `getConsumer` method i
* [Local CSV](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/destination-csv/src/main/java/io/airbyte/integrations/destination/csv/CsvDestination.java#L90)
* [Postgres](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/destination-postgres/src/main/java/io/airbyte/integrations/destination/postgres/PostgresDestination.java)
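
To make the shape of that method concrete, here is a rough, illustrative sketch rather than the canonical implementation: the package names, the `AirbyteMessageConsumer` lifecycle (`start`/`accept`/`close`), and the three-argument `getConsumer` signature (including the `outputRecordCollector` used to emit state messages) are assumptions about the Java connector framework, so defer to your generated `<Name>Destination.java` and the examples linked above:

```java
// Rough sketch only -- not the canonical implementation. The interfaces, packages,
// and the three-argument getConsumer signature are assumptions; check the generated
// <Name>Destination.java and the linked example connectors.
import java.util.function.Consumer;

import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.integrations.base.AirbyteMessageConsumer;
import io.airbyte.protocol.models.AirbyteMessage;
import io.airbyte.protocol.models.ConfiguredAirbyteCatalog;

public class ExampleDestination {

  public AirbyteMessageConsumer getConsumer(final JsonNode config,
                                            final ConfiguredAirbyteCatalog catalog,
                                            final Consumer<AirbyteMessage> outputRecordCollector) {
    return new AirbyteMessageConsumer() {

      @Override
      public void start() throws Exception {
        // Open connections / prepare a buffer or staging area for each stream in the catalog.
      }

      @Override
      public void accept(final AirbyteMessage message) throws Exception {
        if (message.getType() == AirbyteMessage.Type.RECORD) {
          // Write message.getRecord().getData() to the destination table/stream named
          // message.getRecord().getStream(). This sketch assumes records are persisted
          // synchronously; real connectors usually buffer and flush in batches.
        } else if (message.getType() == AirbyteMessage.Type.STATE) {
          // Because this sketch persists records as soon as they are accepted, the state
          // message can be echoed back immediately. A buffering connector must first flush
          // everything received before this point, then emit the state message so Airbyte
          // can checkpoint the sync.
          outputRecordCollector.accept(message);
        }
      }

      @Override
      public void close() throws Exception {
        // Flush any remaining buffered records and release resources.
      }
    };
  }

}
```

The linked Postgres and BigQuery implementations apply the same pattern with real buffering and batched flushing.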


{% hint style="info" %}
The Postgres destination leverages the `AbstractJdbcDestination` superclass which makes it extremely easy to create a destination for a database or data warehouse if it has a compatible JDBC driver. If the destination you are implementing has a JDBC driver, be sure to check out `AbstractJdbcDestination`.
{% endhint %}
2 changes: 1 addition & 1 deletion docs/understanding-airbyte/airbyte-specification.md
@@ -209,7 +209,7 @@ For the sake of brevity, we will not re-describe `spec` and `check`. They are ex
2. `catalog` - An `AirbyteCatalog`. This `catalog` should be a subset of the `catalog` returned by the `discover` command. Any `AirbyteRecordMessage`s that the destination receives that do _not_ match the structure described in the `catalog` will cause the destination to fail.
3. `message stream` - \(this stream is consumed on stdin--it is not passed as an arg\). It will receive a stream of JSON-serialized `AirbyteMessage`s.
* Output:
-1. none.
+1. `AirbyteMessage`s of type `AirbyteStateMessage`. The destination connector should only output state messages if they were previously received as input on stdin. Outputting a state message indicates that all records which came before it have been successfully written to the destination.
* The destination should read in the `AirbyteMessage`s and write any that are of type `AirbyteRecordMessage` to the underlying data store.
* The destination should fail if any of the messages it receives do not match the structure described in the `catalog`.
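
As a hedged illustration of this contract (the stream name, cursor value, and file names are invented, and `destination` stands in for however the connector is actually invoked), an exchange might look like this:

```bash
# Hypothetical exchange; the message schema follows AirbyteMessage, but the stream
# name, cursor value, and file names are illustrative only.
#
# messages.jsonl, piped to the destination on stdin:
#   {"type": "RECORD", "record": {"stream": "users", "data": {"id": 1}, "emitted_at": 1624000000000}}
#   {"type": "RECORD", "record": {"stream": "users", "data": {"id": 2}, "emitted_at": 1624000000001}}
#   {"type": "STATE", "state": {"data": {"users_cursor": "2"}}}
destination write --config config.json --catalog configured_catalog.json < messages.jsonl
# stdout, emitted only once the two records before it have been written:
#   {"type": "STATE", "state": {"data": {"users_cursor": "2"}}}
```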

