
Commit 2bde54a

Revert "airbyte-ci: enable architecture selection on connector build (#32816)"

This reverts commit 965ed21.
bnchrch committed Nov 29, 2023
1 parent a8e8a34 commit 2bde54a
Showing 20 changed files with 95 additions and 294 deletions.
5 changes: 0 additions & 5 deletions .github/workflows/publish_connectors.yml
@@ -18,10 +18,6 @@ on:
type: string
default: ci-runner-connector-publish-large-dagger-0-6-4
required: true
airbyte-ci-binary-url:
description: "URL to airbyte-ci binary"
required: false
default: https://connectors.airbyte.com/airbyte-ci/releases/ubuntu/latest/airbyte-ci
jobs:
publish_connectors:
name: Publish connectors
@@ -66,7 +62,6 @@ jobs:
s3_build_cache_access_key_id: ${{ secrets.SELF_RUNNER_AWS_ACCESS_KEY_ID }}
s3_build_cache_secret_key: ${{ secrets.SELF_RUNNER_AWS_SECRET_ACCESS_KEY }}
subcommand: "connectors ${{ github.event.inputs.connectors-options }} publish ${{ github.event.inputs.publish-options }}"
airbyte_ci_binary_url: ${{ github.event.inputs.airbyte-ci-binary-url }}

set-instatus-incident-on-failure:
name: Create Instatus Incident on Failure
35 changes: 13 additions & 22 deletions airbyte-ci/connectors/pipelines/README.md
@@ -124,17 +124,18 @@ At this point you can run `airbyte-ci` commands.

#### Options

| Option | Default value | Mapped environment variable | Description |
| ------------------------------------------ | -------------------------------------------------------------------------------------------------------------------------------- | ----------------------------- | ------------------------------------------------------------------------------------------- |
| `--enable-dagger-run/--disable-dagger-run` | `--enable-dagger-run` | | Disables the Dagger terminal UI. |
| `--is-local/--is-ci` | `--is-local` | | Determines the environment in which the CLI runs: local environment or CI environment. |
| `--git-branch` | The checked out git branch name | `CI_GIT_BRANCH` | The git branch on which the pipelines will run. |
| `--git-revision` | The current branch head | `CI_GIT_REVISION` | The commit hash on which the pipelines will run. |
| `--diffed-branch` | `origin/master` | | Branch to which the git diff will happen to detect new or modified files. |
| `--gha-workflow-run-id` | | | GHA CI only - The run id of the GitHub action workflow |
| `--ci-context` | `manual` | | The current CI context: `manual` for manual run, `pull_request`, `nightly_builds`, `master` |
| `--pipeline-start-timestamp` | Current epoch time | `CI_PIPELINE_START_TIMESTAMP` | Start time of the pipeline as epoch time. Used for pipeline run duration computation. |
| `--show-dagger-logs/--hide-dagger-logs` | `--hide-dagger-logs` | | Flag to show or hide the dagger logs. |
| Option | Default value | Mapped environment variable | Description |
| ------------------------------------------ | ---------------------------------------------------------------------------------------------- | ----------------------------- | ------------------------------------------------------------------------------------------- |
| `--enable-dagger-run/--disable-dagger-run` | `--enable-dagger-run` | | Disables the Dagger terminal UI. |
| `--enable-auto-update/--disable-auto-update` | `--enable-auto-update` | | Disables the auto-update prompt. |
| `--is-local/--is-ci` | `--is-local` | | Determines the environment in which the CLI runs: local environment or CI environment. |
| `--git-branch` | The checked out git branch name | `CI_GIT_BRANCH` | The git branch on which the pipelines will run. |
| `--git-revision` | The current branch head | `CI_GIT_REVISION` | The commit hash on which the pipelines will run. |
| `--diffed-branch` | `origin/master` | | Branch to which the git diff will happen to detect new or modified files. |
| `--gha-workflow-run-id` | | | GHA CI only - The run id of the GitHub action workflow |
| `--ci-context` | `manual` | | The current CI context: `manual` for manual run, `pull_request`, `nightly_builds`, `master` |
| `--pipeline-start-timestamp` | Current epoch time | `CI_PIPELINE_START_TIMESTAMP` | Start time of the pipeline as epoch time. Used for pipeline run duration computation. |
| `--show-dagger-logs/--hide-dagger-logs` | `--hide-dagger-logs` | | Flag to show or hide the dagger logs. |

### <a id="connectors-command-subgroup"></a>`connectors` command subgroup

@@ -254,9 +255,6 @@ It's mainly purposed for local use.
Build a single connector:
`airbyte-ci connectors --name=source-pokeapi build`

Build a single connector for multiple architectures:
`airbyte-ci connectors --name=source-pokeapi build --architecture=linux/amd64 --architecture=linux/arm64`

Build multiple connectors:
`airbyte-ci connectors --name=source-pokeapi --name=source-bigquery build`

@@ -293,17 +291,11 @@ flowchart TD
distTar-->connector
normalization--"if supports normalization"-->connector
load[Load to docker host with :dev tag]
load[Load to docker host with :dev tag, current platform]
spec[Get spec]
connector-->spec--"if success"-->load
```

### Options

| Option | Multiple | Default value | Description |
| --------------------- | -------- | -------------- | ----------------------------------------------------------------- |
| `--architecture`/`-a` | True | Local platform | Defines for which architecture the connector image will be built. |

### <a id="connectors-publish-command"></a>`connectors publish` command
Run a publish pipeline for one or multiple connectors.
It's mainly purposed for CI use to release a connector update.
@@ -442,7 +434,6 @@ This command runs the Python tests for a airbyte-ci poetry package.
## Changelog
| Version | PR | Description |
| ------- | ---------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- |
| 2.9.0 | [#32816](https://github.com/airbytehq/airbyte/pull/32816) | Add `--architecture` option to connector build. |
| 2.8.0 | [#31930](https://github.com/airbytehq/airbyte/pull/31930) | Move pipx install to `airbyte-ci-dev`, and add auto-update feature targeting binary |
| 2.7.3 | [#32847](https://github.com/airbytehq/airbyte/pull/32847) | Improve --modified behaviour for pull requests. |
| 2.7.2 | [#32839](https://github.com/airbytehq/airbyte/pull/32839) | Revert changes in v2.7.1. |
@@ -2,16 +2,11 @@
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from typing import List

import asyncclick as click
import dagger
from pipelines import main_logger
from pipelines.airbyte_ci.connectors.build_image.steps import run_connector_build_pipeline
from pipelines.airbyte_ci.connectors.context import ConnectorContext
from pipelines.airbyte_ci.connectors.pipeline import run_connectors_pipelines
from pipelines.cli.dagger_pipeline_command import DaggerPipelineCommand
from pipelines.consts import BUILD_PLATFORMS, LOCAL_BUILD_PLATFORM


@click.command(cls=DaggerPipelineCommand, help="Build all images for the selected connectors.")
@@ -22,20 +22,10 @@
default=False,
type=bool,
)
@click.option(
"-a",
"--architecture",
"build_architectures",
help="Architecture for which to build the connector image. If not specified, the image will be built for the local architecture.",
multiple=True,
default=[LOCAL_BUILD_PLATFORM],
type=click.Choice(BUILD_PLATFORMS, case_sensitive=True),
)
@click.pass_context
async def build(ctx: click.Context, use_host_gradle_dist_tar: bool, build_architectures: List[str]) -> bool:
async def build(ctx: click.Context, use_host_gradle_dist_tar: bool) -> bool:
"""Runs a build pipeline for the selected connectors."""
build_platforms = [dagger.Platform(architecture) for architecture in build_architectures]
main_logger.info(f"Building connectors for {build_platforms}, use --architecture to change this.")

connectors_contexts = [
ConnectorContext(
pipeline_name=f"Build connector {connector.technical_name}",
@@ -56,7 +41,6 @@ async def build(ctx: click.Context, use_host_gradle_dist_tar: bool, build_archit
use_host_gradle_dist_tar=use_host_gradle_dist_tar,
s3_build_cache_access_key_id=ctx.obj.get("s3_build_cache_access_key_id"),
s3_build_cache_secret_key=ctx.obj.get("s3_build_cache_secret_key"),
targeted_platforms=build_platforms,
)
for connector in ctx.obj["selected_connectors_with_modified_files"]
]
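For context on the pattern this revert removes: `--architecture` was a multi-value click option whose selected values were turned into `dagger.Platform` targets before the build ran. Below is a minimal standalone sketch of that option pattern. It is illustrative only: `SUPPORTED_PLATFORMS` and `LOCAL_PLATFORM` stand in for the real `BUILD_PLATFORMS` and `LOCAL_BUILD_PLATFORM` constants, and plain `click` is used instead of `asyncclick` to keep it self-contained.

```python
import platform

import click

# Hypothetical stand-ins for pipelines.consts.BUILD_PLATFORMS / LOCAL_BUILD_PLATFORM.
SUPPORTED_PLATFORMS = ("linux/amd64", "linux/arm64")
LOCAL_PLATFORM = "linux/arm64" if platform.machine().lower() in ("arm64", "aarch64") else "linux/amd64"


@click.command(help="Build a connector image for one or more architectures.")
@click.option(
    "-a",
    "--architecture",
    "build_architectures",
    multiple=True,
    default=[LOCAL_PLATFORM],
    type=click.Choice(SUPPORTED_PLATFORMS, case_sensitive=True),
    help="Architecture(s) to build for. Defaults to the local architecture.",
)
def build(build_architectures: tuple) -> None:
    # In the real pipeline each selected value would become a dagger.Platform.
    click.echo(f"Would build for: {', '.join(build_architectures)}")


if __name__ == "__main__":
    build()
```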
@@ -5,13 +5,17 @@

from __future__ import annotations

import platform

import anyio
from connector_ops.utils import ConnectorLanguage
from pipelines.airbyte_ci.connectors.build_image.steps import java_connectors, python_connectors
from pipelines.models.steps import StepResult
from pipelines.airbyte_ci.connectors.build_image.steps import python_connectors
from pipelines.airbyte_ci.connectors.build_image.steps.common import LoadContainerToLocalDockerHost, StepStatus
from pipelines.consts import LOCAL_BUILD_PLATFORM
from pipelines.airbyte_ci.connectors.build_image.steps import java_connectors
from pipelines.airbyte_ci.connectors.context import ConnectorContext
from pipelines.airbyte_ci.connectors.reports import ConnectorReport
from pipelines.models.steps import StepResult


class NoBuildStepForLanguageError(Exception):
@@ -37,18 +41,17 @@ async def run_connector_build_pipeline(context: ConnectorContext, semaphore: any
Args:
context (ConnectorContext): The initialized connector context.
semaphore (anyio.Semaphore): The semaphore to use to limit the number of concurrent builds.
Returns:
ConnectorReport: The reports holding builds results.
"""
step_results = []
async with semaphore:
async with context:
build_result = await run_connector_build(context)
per_platform_built_containers = build_result.output_artifact
step_results.append(build_result)
if context.is_local and build_result.status is StepStatus.SUCCESS:
load_image_result = await LoadContainerToLocalDockerHost(context, per_platform_built_containers).run()
load_image_result = await LoadContainerToLocalDockerHost(context, LOCAL_BUILD_PLATFORM, build_result.output_artifact).run()
step_results.append(load_image_result)
context.report = ConnectorReport(context, step_results, name="BUILD RESULTS")
return context.report
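The pipeline above caps concurrent connector builds with an `anyio.Semaphore` passed in by the caller. A minimal sketch of that concurrency pattern follows; the connector names and sleep are placeholders for the real build steps.

```python
import anyio


async def build_one(name: str, semaphore: anyio.Semaphore) -> None:
    # Only a bounded number of builds run at once; the rest wait here.
    async with semaphore:
        print(f"building {name}")
        await anyio.sleep(1)  # stand-in for the actual build work
        print(f"done {name}")


async def main() -> None:
    semaphore = anyio.Semaphore(2)  # at most two concurrent builds
    async with anyio.create_task_group() as tg:
        for name in ("source-pokeapi", "source-bigquery", "source-faker"):
            tg.start_soon(build_one, name, semaphore)


anyio.run(main)
```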
@@ -2,14 +2,14 @@
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import json
from abc import ABC
from typing import List, Tuple

import docker
from dagger import Container, ExecError, Platform, QueryError
from pipelines.airbyte_ci.connectors.context import ConnectorContext
from pipelines.helpers.utils import export_containers_to_tarball
from pipelines.consts import BUILD_PLATFORMS
from pipelines.helpers.utils import export_container_to_tarball
from pipelines.models.steps import Step, StepResult, StepStatus


@@ -22,8 +22,8 @@ class BuildConnectorImagesBase(Step, ABC):
def title(self):
return f"Build {self.context.connector.technical_name} docker image for platform(s) {', '.join(self.build_platforms)}"

def __init__(self, context: ConnectorContext) -> None:
self.build_platforms: List[Platform] = context.targeted_platforms
def __init__(self, context: ConnectorContext, *build_platforms: List[Platform]) -> None:
self.build_platforms = build_platforms if build_platforms else BUILD_PLATFORMS
super().__init__(context)

async def _run(self, *args) -> StepResult:
@@ -58,40 +58,26 @@ async def _build_connector(self, platform: Platform, *args) -> Container:
class LoadContainerToLocalDockerHost(Step):
IMAGE_TAG = "dev"

def __init__(self, context: ConnectorContext, containers: dict[Platform, Container]) -> None:
def __init__(self, context: ConnectorContext, platform: Platform, containers: dict[Platform, Container]) -> None:
super().__init__(context)
self.containers = containers
self.platform = platform
self.container = containers[platform]

@property
def title(self):
return f"Load {self.image_name}:{self.IMAGE_TAG} to the local docker host."
return f"Load {self.image_name}:{self.IMAGE_TAG} for platform {self.platform} to the local docker host."

@property
def image_name(self) -> Tuple:
return f"airbyte/{self.context.connector.technical_name}"

async def _run(self) -> StepResult:
container_variants = list(self.containers.values())
_, exported_tar_path = await export_containers_to_tarball(self.context, container_variants)
if not exported_tar_path:
return StepResult(
self,
StepStatus.FAILURE,
stderr=f"Failed to export the connector image {self.image_name}:{self.IMAGE_TAG} to a tarball.",
)
_, exported_tarball_path = await export_container_to_tarball(self.context, self.container)
client = docker.from_env()
try:
client = docker.from_env()
response = client.api.import_image_from_file(str(exported_tar_path), repository=self.image_name, tag=self.IMAGE_TAG)
try:
image_sha = json.loads(response)["status"]
except (json.JSONDecodeError, KeyError):
return StepResult(
self,
StepStatus.FAILURE,
stderr=f"Failed to import the connector image {self.image_name}:{self.IMAGE_TAG} to your Docker host: {response}",
)
return StepResult(
self, StepStatus.SUCCESS, stdout=f"Loaded image {self.image_name}:{self.IMAGE_TAG} to your Docker host ({image_sha})."
)
except docker.errors.DockerException as e:
return StepResult(self, StepStatus.FAILURE, stderr=f"Something went wrong while interacting with the local docker client: {e}")
with open(exported_tarball_path, "rb") as tarball_content:
new_image = client.images.load(tarball_content.read())[0]
new_image.tag(self.image_name, tag=self.IMAGE_TAG)
return StepResult(self, StepStatus.SUCCESS)
except ConnectionError:
return StepResult(self, StepStatus.FAILURE, stderr="The connection to the local docker host failed.")
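The restored `_run` above relies on docker-py to load an exported image tarball into the local daemon and tag it as `:dev`. Here is a minimal sketch of that docker-py pattern, assuming a reachable Docker daemon; the tarball path and repository name are illustrative.

```python
import docker


def load_tarball_into_docker(tarball_path: str, repository: str, tag: str = "dev") -> str:
    """Load an image tarball into the local Docker daemon and tag it."""
    client = docker.from_env()
    with open(tarball_path, "rb") as tarball:
        # images.load returns the list of images contained in the tarball.
        image = client.images.load(tarball.read())[0]
    image.tag(repository, tag=tag)
    return f"{repository}:{tag}"


# Example call (paths and names are illustrative):
# load_tarball_into_docker("/tmp/source-pokeapi.tar", "airbyte/source-pokeapi")
```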
@@ -2,10 +2,13 @@
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

from dagger import Container, Directory, File, Platform, QueryError
from typing import List, Optional, Tuple, Union

from dagger import Container, Directory, ExecError, File, Host, Platform, QueryError
from pipelines.airbyte_ci.connectors.build_image.steps.common import BuildConnectorImagesBase
from pipelines.airbyte_ci.connectors.context import ConnectorContext
from pipelines.airbyte_ci.steps.gradle import GradleTask
from pipelines.consts import LOCAL_BUILD_PLATFORM
from pipelines.dagger.containers import java
from pipelines.models.steps import StepResult, StepStatus

@@ -53,7 +56,7 @@ async def run_connector_build(context: ConnectorContext) -> StepResult:
# Special case: use a local dist tar to speed up local development.
dist_dir = await context.dagger_client.host().directory(dist_tar_directory_path(context), include=["*.tar"])
# Speed things up by only building for the local platform.
return await BuildConnectorImages(context).run(dist_dir)
return await BuildConnectorImages(context, LOCAL_BUILD_PLATFORM).run(dist_dir)

# Default case: distribution tar is built by the dagger pipeline.
build_connector_tar_result = await BuildConnectorDistributionTar(context).run()
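The Java build's "local dist tar" special case above mounts only pre-built tarballs from the host via Dagger's `include` filter. A minimal sketch of that host-directory pattern with the Dagger Python SDK follows (connection style matches the 0.6.x SDK referenced in this repo; the directory path is illustrative).

```python
import sys

import anyio
import dagger


async def main() -> None:
    # Connect to the Dagger engine (0.6.x Python SDK connection style).
    async with dagger.Connection(dagger.Config(log_output=sys.stderr)) as client:
        # Only pick up pre-built distribution tarballs from the host checkout.
        dist_dir = client.host().directory(
            "airbyte-integrations/connectors/source-foo/build/distributions",  # illustrative path
            include=["*.tar"],
        )
        print(await dist_dir.entries())


anyio.run(main)
```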
