connectors-ci: deprecate slash publish (#25865)
alafanechere committed May 22, 2023
1 parent 8bfbef2 commit 80032f7
Showing 15 changed files with 489 additions and 475 deletions.
3 changes: 1 addition & 2 deletions .github/workflows/gradle.yml
@@ -207,7 +207,6 @@ jobs:
attempt_limit: 3
attempt_delay: 5000 # in ms


# Connectors Base
# In case of self-hosted EC2 errors, remove this block.
start-connectors-base-build-runner:
@@ -272,7 +271,7 @@ jobs:
run: python3 -m pip install virtualenv==16.7.9 --user

- name: Install automake
run: apt-get install -y automake build-essential libtool libtool-bin autoconf
run: apt-get update && apt-get install -y automake build-essential libtool libtool-bin autoconf

- name: Set up CI Gradle Properties
run: |
448 changes: 448 additions & 0 deletions .github/workflows/legacy-publish-command.yml

Large diffs are not rendered by default.

418 changes: 9 additions & 409 deletions .github/workflows/publish-command.yml

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions .github/workflows/publish_connectors.yml
@@ -1,4 +1,4 @@
name: (WIP) Publish connectors
name: Publish connectors on merge to master

on:
push:
@@ -49,7 +49,7 @@ jobs:
# Setting concurrency to 1 for safety:
# High concurrency can lead to resource issues for java connectors.
# As speed is not a concern in this context I think not publishing connectors in parallel is fine.
subcommand: "connectors --concurrency=1 --execute-timeout=3600 --modified publish --pre-release"
subcommand: "connectors --concurrency=1 --execute-timeout=3600 --modified publish --main-release"
context: "master"
- name: Publish connectors [manual]
id: publish-connectors
1 change: 1 addition & 0 deletions .github/workflows/slash-commands.yml
@@ -27,6 +27,7 @@ jobs:
build-connector
publish-connector
publish
legacy-publish
publish-external
gke-kube-test
run-specific-test
10 changes: 3 additions & 7 deletions .github/workflows/upload-metadata-files.yml
@@ -1,11 +1,7 @@
name: Upload any Changed Metadata Files
name: "Upload any Changed Metadata Files [Exceptional Use!]"

on:
push:
branches:
- master
paths:
- "airbyte-integrations/connectors/**/metadata.yaml"
workflow_dispatch:

jobs:
deploy-catalog-to-stage:
@@ -23,7 +19,7 @@ jobs:
if: steps.changed-files.outputs.any_changed == 'true'
uses: actions/setup-python@v4
with:
python-version: '3.10'
python-version: "3.10"
- name: Install metadata_service
if: steps.changed-files.outputs.any_changed == 'true'
run: pip install airbyte-ci/connectors/metadata_service/lib
3 changes: 1 addition & 2 deletions airbyte-ci/connectors/CONNECTOR_CHECKLIST.yaml
@@ -7,5 +7,4 @@ paths:
- All documentation files are up to date. (README.md, bootstrap.md, docs.md, etc...)
- Changelog updated in `docs/integrations/<source or destination>/<name>.md` with an entry for the new version. See changelog [example](https://docs.airbyte.io/integrations/sources/stripe#changelog)
- You, or an Airbyter, have run `/test` successfully on this PR - or on a non-forked branch
- You, or an Airbyter, have run `/publish` successfully on this PR - or on a non-forked branch
- You've updated the connector's [metadata.yaml](https://docs.airbyte.com/connector-development/connector-metadata-file) file `new!`
- You've updated the connector's `metadata.yaml` file (new!)

Large diffs are not rendered by default.

@@ -31,7 +31,7 @@ Our SSH connector support is designed to be easy to plug into any existing conne
Replace port_key and host_key as necessary. Look at `transform_postgres()` to see an example.
2. To make sure your changes are present in Normalization when running tests on the connector locally, you'll need to change [this version tag](https://github.com/airbytehq/airbyte/blob/6d9ba022646441c7f298ca4dcaa3df59b9a19fbb/airbyte-workers/src/main/java/io/airbyte/workers/normalization/DefaultNormalizationRunner.java#L50) to `dev` so that the new locally built docker image for Normalization is used. Don't push this change with the PR though.
3. If your `host_key="host"` and `port_key="port"` then this step is not necessary. However if the key names differ for your connector, you will also need to add some logic into `sshtunneling.sh` (within airbyte-workers) to handle this, as currently it assumes that the keys are exactly `host` and `port`.
4. When making your PR, make sure that you've version bumped Normalization (in `airbyte-workers/src/main/java/io/airbyte/workers/normalization/DefaultNormalizationRunner.java` and `airbyte-integrations/bases/base-normalization/Dockerfile`). You'll need to /test & /publish Normalization _first_ so that when you /test the connector, it can use the new version.
4. When making your PR, make sure that you've version bumped Normalization (in `airbyte-workers/src/main/java/io/airbyte/workers/normalization/DefaultNormalizationRunner.java` and `airbyte-integrations/bases/base-normalization/Dockerfile`). You'll need to /test & /legacy-publish Normalization _first_ so that when you /test the connector, it can use the new version.
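
In practice, the two commands mentioned in step 4 are posted as PR comments. For example (the connector name below is a placeholder, not taken from this diff):

```text
/legacy-publish connector=bases/base-normalization
/test connector=connectors/<your-connector>
```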

## Misc

7 changes: 7 additions & 0 deletions airbyte-integrations/bases/base-normalization/README.md
@@ -452,3 +452,10 @@ For more details and options, you can also refer to the [testing connectors docs
### Acceptance Tests

Please refer to the [developing docs](../../../docs/contributing-to-airbyte/developing-locally.md) on how to run Acceptance Tests.

## Publishing normalization
The normalization publish pipeline still relies on the `manage.sh` [script](https://github.com/airbytehq/airbyte/blob/master/tools/integrations/manage.sh). Normalization is not published on merge to master, but on demand, from the PR. To publish normalization, run the following slash command on the PR:

```text
/legacy-publish connector=bases/base-normalization
```
@@ -49,7 +49,6 @@ To do so, from the root of the `airbyte` repo, run `./airbyte-cdk/python/bin/run
* When running the `connectorAcceptanceTest` Gradle task
* When running `./acceptance-test-docker.sh` in a connector project
* When running `/test` command on a GitHub pull request.
* When running `/publish` command on a GitHub pull request.
* When running the `integration-test` GitHub action. This is the same action that creates and uploads the test report JSON files that power the badges in the [connector registry summary report](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html).

## Developing on the acceptance tests
34 changes: 10 additions & 24 deletions docs/connector-development/README.md
@@ -140,46 +140,32 @@ The presence of these fields will enable normalization for the connector, and de

Once you've finished iterating on the changes to a connector as specified in its `README.md`, follow these instructions to ship the new version of the connector with Airbyte out of the box.

1. Bump the version in the `Dockerfile` of the connector \(`LABEL io.airbyte.version=X.X.X`\).
1. Update the docker image version in the [metadata.yaml](connector-metadata-file.md) of the connector.
1. Submit a PR containing the changes you made.
1. One of Airbyte maintainers will review the change and publish the new version of the connector to Docker hub. Triggering tests and publishing connectors can be done by leaving a comment on the PR with the following format \(the PR must be from the Airbyte repo, not a fork\):

1. Bump the version in the `Dockerfile` of the connector \(`LABEL io.airbyte.version=X.X.X`\).
2. Bump the docker image version in the [metadata.yaml](connector-metadata-file.md) of the connector.
3. Submit a PR containing the changes you made.
4. One of Airbyte maintainers will review the change in the new version. Triggering tests can be done by leaving a comment on the PR with the following format \(the PR must be from the Airbyte repo, not a fork\):
```text
# to run integration tests for the connector
# Example: /test connector=connectors/source-hubspot
/test connector=(connectors|bases)/<connector_name>
# to run integration tests, publish the connector, and use the updated connector version in our config/metadata files
# Example: /publish connector=connectors/source-hubspot
/publish connector=(connectors|bases)/<connector_name>
/test connector=(connectors|bases)/<connector_name>
```
1. The new version of the connector is now available for everyone who uses it. Thank you!

5. You or an Airbyte maintainer can merge the PR once it is approved and all the required CI checks are passing.
6. Once the PR is merged, the new connector version will be published to DockerHub and the connector will be available for everyone who uses it. Thank you!

### Updating Connector Metadata

When a new (or updated version) of a connector is ready to be published, our automations will check your branch for a few things:
When a new or updated version of a connector is ready, our automation will check your branch for a few things:
* Does the connector have an icon?
* Does the connector have documentation and is it in the proper format?
* Does the connector have a changelog entry for this version?
* Is the [metadata.yaml](connector-metadata-file.md) file valid?

If any of the above are failing, you won't be able to merge your PR or publish your connector.

Connector icons should be square SVGs and be located in [this directory](https://github.com/airbytehq/airbyte/tree/master/airbyte-config-oss/init-oss/src/main/resources/icons).

Connector documentation and changelogs are markdown files which live either [here for sources](https://github.com/airbytehq/airbyte/tree/master/docs/integrations/sources), or [here for destinations](https://github.com/airbytehq/airbyte/tree/master/docs/integrations/destinations).

The [metadata.yaml](connector-metadata-file.md) file is valid.

### The /publish command
Connector documentation and changelogs are markdown files living either [here for sources](https://github.com/airbytehq/airbyte/tree/master/docs/integrations/sources), or [here for destinations](https://github.com/airbytehq/airbyte/tree/master/docs/integrations/destinations).

Publishing a connector can be done using the `/publish` command as outlined in the above section. The command runs a [github workflow](https://github.com/airbytehq/airbyte/actions/workflows/publish-command.yml), and has the following configurable parameters:
* **connector** - Required. This tells the workflow which connector to publish. e.g. `connector=connectors/source-amazon-ads`. This can also be a comma-separated list of many connectors, e.g. `connector=connectors/source-s3,connectors/destination-postgres,connectors/source-facebook-marketing`. See the parallel flag below if publishing multiple connectors.
* **repo** - Defaults to the main airbyte repo. Set this when building connectors from forked repos. e.g. `repo=userfork/airbyte`
* **gitref** - Defaults to the branch of the PR where the /publish command is run as a comment. If running manually, set this to your branch where you made changes e.g. `gitref=george/s3-update`
* **comment-id** - This is automatically filled if you run /publish from a comment and enables the workflow to write back success/fail logs to the git comment.
* **parallel** - Defaults to false. If set to true, a pool of runner agents will be spun up to allow publishing multiple connectors in parallel. Only switch this to true if publishing multiple connectors at once to avoid wasting $$$.
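
Putting the parameters above together (example values are taken from the list itself), a multi-connector publish comment could look like:

```text
/publish connector=connectors/source-s3,connectors/destination-postgres parallel=true
```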

## Using credentials in CI

14 changes: 2 additions & 12 deletions docs/connector-development/testing-connectors/README.md
@@ -17,7 +17,7 @@ Using the following tools:
3. isort
4. mypy

Airbyte CI/CD workflows use them during "test/publish" commands obligatorily.
Airbyte CI/CD workflows run these tools as a mandatory step of "test" commands.
All their settings are aggregated in a single `pyproject.toml` file at the Airbyte project root.
Locally, all these tools can be launched with the following Gradle command:
```
…
```
@@ -53,18 +53,8 @@ Once the integration test workflow launches, it will append a link to the workfl

Integration tests can also be manually requested by clicking "[Run workflow](https://github.com/airbytehq/airbyte/actions?query=workflow%3Aintegration-test)" and specifying the connector and GitHub ref.

### 4. Requesting GitHub PR publishing Docker Images

In order for users to reference the new versions of a connector, it needs to be published and available in the [dockerhub](https://hub.docker.com/r/airbyte/source-sendgrid/tags?page=1&ordering=last_updated) with the latest tag updated.

As seen previously, GitHub workflow can be triggered by comment submission. Publishing docker images to the dockerhub repository can also be submitted likewise:

Note that integration tests can be triggered with a slightly different syntax for arguments. This second set is required to distinguish between `connectors` and `bases` folders. Thus, it is also easier to switch between the `/test` and `/publish` commands:

* `/test connector=connectors/source-sendgrid` - Runs integration tests for a single connector on the latest PR commit.
* `/publish connector=connectors/source-sendgrid` - Publish the docker image if it doesn't exist for a single connector on the latest PR commit.

### 5. Automatically Run From `master`
### 4. Automatically Run From `master`

Commits to `master` attempt to launch integration tests. Two workflows launch for each commit: one is a launcher for integration tests, the other is the core build \(the same as the default for PR and branch builds\).

@@ -83,18 +83,8 @@ Once the integration test workflow launches, it will append a link to the workfl

Integration tests can also be manually requested by clicking "[Run workflow](https://github.com/airbytehq/airbyte/actions?query=workflow%3Aintegration-test)" and specifying the connector and GitHub ref.

### 3. Requesting GitHub PR publishing Docker Images

In order for users to reference the new versions of a connector, it needs to be published and available in the [dockerhub](https://hub.docker.com/r/airbyte/source-sendgrid/tags?page=1&ordering=last_updated) with the latest tag updated.

As seen previously, GitHub workflow can be triggered by comment submission. Publishing docker images to the dockerhub repository can also be submitted likewise:

Note that integration tests can be triggered with a slightly different syntax for arguments. This second set is required to distinguish between `connectors` and `bases` folders. Thus, it is also easier to switch between the `/test` and `/publish` commands:

* `/test connector=connectors/source-sendgrid` - Runs integration tests for a single connector on the latest PR commit.
* `/publish connector=connectors/source-sendgrid` - Publish the docker image if it doesn't exist for a single connector on the latest PR commit.

### 4. Automatically Run From `master`
### 3. Automatically Run From `master`

Commits to `master` attempt to launch integration tests. Two workflows launch for each commit: one is a launcher for integration tests, the other is the core build \(the same as the default for PR and branch builds\).

5 changes: 2 additions & 3 deletions tools/ci_connector_ops/ci_connector_ops/pipelines/contexts.py
@@ -518,7 +518,6 @@ def create_slack_message(self) -> str:
message += f" {self.state.value['description']}\n"
if self.state is ContextState.SUCCESSFUL:
message += f"⏲️ Run duration: {round(self.report.run_duration)}s\n"
# TODO: renable this when pipeline is stable
# if self.state is ContextState.FAILURE:
# message += "\ncc. <!channel>"
if self.state is ContextState.FAILURE:
message += "\ncc. <!channel>"
return message
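
The hunk above re-enables the channel mention on failure now that the pipeline is stable. A minimal, self-contained sketch of the resulting behavior (the real method lives on a pipeline context class in `contexts.py`; the enum values and the standalone function signature here are assumptions for illustration):

```python
from enum import Enum


class ContextState(Enum):
    # Assumed values; the real enum in contexts.py carries richer metadata.
    SUCCESSFUL = "successful"
    FAILURE = "failure"


def create_slack_message(state: ContextState, run_duration_s: float = 0.0) -> str:
    """Build a Slack status message, pinging the channel when the pipeline failed."""
    message = ""
    if state is ContextState.SUCCESSFUL:
        message += f"⏲️ Run duration: {round(run_duration_s)}s\n"
    if state is ContextState.FAILURE:
        # Re-enabled by this commit: notify the whole channel on failure.
        message += "\ncc. <!channel>"
    return message
```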
