🎉 New Source: Display&Video 360 (#11828)
* Add dv360 connector source

* update query methods

* sanitize fields and fetch only fields in config_catalog

* Read and check methods + rmv extra spaces from schema fields

* add timezone in query + log error message in Read method

* start incr stream and tests

* Add unit tests and documentation

* rmv chunk date method

* Add incremental stream

* Add dv360 connector

* Add dv360 connector

* rmv .hbs from unit test file

* Add BOOTSTRAP.md file

* Delete airbyte-integrations/connectors/source-dv360 directory

Delete old folder

* Add Docs

* update config_catalog

* filter last row in case of an additional empty row for the sum of the metrics + add required fields in unique_reach_audience stream

* Update read method by removing summary row in the case of the standard report

* Update invalid_config and spec files

* Add first state msg in Read method

* Code format

* format connector

* add to seed file

* auto-bump connector version [ci skip]

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Co-authored-by: Marcos Marx <marcosmarxm@users.noreply.github.com>
4 people committed Sep 28, 2022
1 parent 13bab6e commit 2aea783
Showing 37 changed files with 6,148 additions and 0 deletions.
@@ -253,6 +253,13 @@
  icon: drift.svg
  sourceType: api
  releaseStage: alpha
- name: DV 360
  sourceDefinitionId: 1356e1d9-977f-4057-ad4b-65f25329cf61
  dockerRepository: airbyte/source-dv-360
  dockerImageTag: 0.1.0
  documentationUrl: https://docs.airbyte.io/integrations/sources/dv-360
  sourceType: api
  releaseStage: alpha
- name: E2E Testing
  sourceDefinitionId: d53f9084-fa6b-4a5a-976c-5b8392f4ad8a
  dockerRepository: airbyte/source-e2e-test
70 changes: 70 additions & 0 deletions airbyte-config/init/src/main/resources/seed/source_specs.yaml
@@ -2340,6 +2340,76 @@
oauthFlowOutputParameters:
- - "access_token"
- - "refresh_token"
- dockerImage: "airbyte/source-dv-360:0.1.0"
  spec:
    documentationUrl: "https://docs.airbyte.io/integrations/sources/dv-360"
    connectionSpecification:
      $schema: "http://json-schema.org/draft-07/schema#"
      title: "Display & Video 360 Spec"
      type: "object"
      required:
      - "credentials"
      - "partner_id"
      - "start_date"
      additionalProperties: true
      properties:
        credentials:
          type: "object"
          description: "Oauth2 credentials"
          order: 0
          required:
          - "access_token"
          - "refresh_token"
          - "token_uri"
          - "client_id"
          - "client_secret"
          properties:
            access_token:
              type: "string"
              description: "Access token"
              airbyte_secret: true
            refresh_token:
              type: "string"
              description: "Refresh token"
              airbyte_secret: true
            token_uri:
              type: "string"
              description: "Token URI"
              airbyte_secret: true
            client_id:
              type: "string"
              description: "Client ID"
              airbyte_secret: true
            client_secret:
              type: "string"
              description: "Client secret"
              airbyte_secret: true
        partner_id:
          type: "integer"
          description: "Partner ID"
          order: 1
        start_date:
          type: "string"
          description: "UTC date in the format 2017-01-25. Any data before this\
            \ date will not be replicated."
          pattern: "^[0-9]{4}-[0-9]{2}-[0-9]{2}$"
          order: 2
        end_date:
          type: "string"
          description: "UTC date in the format 2017-01-25. Any data after this\
            \ date will not be replicated."
          pattern: "^[0-9]{4}-[0-9]{2}-[0-9]{2}$"
          order: 3
        filters:
          type: "array"
          description: "Filters for the dimensions. Each filter object has two\
            \ keys: 'type' for the name of the dimension to filter on, and 'value'\
            \ for the value of the filter."
          default: []
          order: 4
  supportsNormalization: false
  supportsDBT: false
  supported_destination_sync_modes: []
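A `secrets/config.json` conforming to the spec above might look like the following sketch. All credential values are illustrative placeholders, and the `validate_config` helper is only an illustration of the constraints the spec encodes (required fields and the `YYYY-MM-DD` date pattern), not part of the connector:

```python
import re

# Illustrative config matching the Display & Video 360 spec;
# every credential value here is a placeholder, not a real secret.
sample_config = {
    "credentials": {
        "access_token": "ya29.placeholder",
        "refresh_token": "1//placeholder",
        "token_uri": "https://oauth2.googleapis.com/token",
        "client_id": "placeholder.apps.googleusercontent.com",
        "client_secret": "placeholder-secret",
    },
    "partner_id": 123456,
    "start_date": "2022-01-01",
    "end_date": "2022-06-30",
    "filters": [{"type": "FILTER_ADVERTISER", "value": "111"}],
}

# Same pattern the spec declares for start_date / end_date.
DATE_PATTERN = re.compile(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}$")


def validate_config(config: dict) -> list:
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    # The spec marks credentials, partner_id and start_date as required.
    for key in ("credentials", "partner_id", "start_date"):
        if key not in config:
            problems.append(f"missing required field: {key}")
    # Dates must match the YYYY-MM-DD pattern from the spec.
    for key in ("start_date", "end_date"):
        if key in config and not DATE_PATTERN.match(str(config[key])):
            problems.append(f"{key} must match YYYY-MM-DD")
    return problems
```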
- dockerImage: "airbyte/source-e2e-test:2.1.1"
spec:
documentationUrl: "https://docs.airbyte.io/integrations/sources/e2e-test"
6 changes: 6 additions & 0 deletions airbyte-integrations/connectors/source-dv-360/.dockerignore
@@ -0,0 +1,6 @@
*
!Dockerfile
!main.py
!source_dv_360
!setup.py
!secrets
17 changes: 17 additions & 0 deletions airbyte-integrations/connectors/source-dv-360/BOOTSTRAP.md
@@ -0,0 +1,17 @@
# Display & Video 360

Google DoubleClick Bid Manager (DBM) is the API that enables developers to manage Queries and retrieve Reports from Display & Video 360.

DoubleClick Bid Manager API `v1.1` is the latest available and recommended version.

[Link](https://developers.google.com/bid-manager/v1.1) to the official documentation.

[Getting started with the API](https://developers.google.com/bid-manager/guides/getting-started-api)

**Workflow of the API**:
* To fetch data from the DBM API, first build a [query](https://developers.google.com/bid-manager/v1.1/queries), which can also be created in the [user interface (UI)](https://www.google.com/ddm/bidmanager/).
* Once the query is created it can be executed, and the resulting [report](https://developers.google.com/bid-manager/v1.1/reports) can be found and downloaded in the UI.

**Filters and Metrics**: Dimensions are referred to as Filters in DV360. All available dimensions and metrics can be found [here](https://developers.google.com/bid-manager/v1.1/filters-metrics).

**Note**: The reporting [best practices](https://developers.google.com/bid-manager/guides/scheduled-reports/best-practices) recommend first building the desired report in the UI to avoid errors, since there are several limitations and requirements pertaining to reporting types, filters, dimensions, and metrics (such as valid combinations of metrics and dimensions).
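The create-query step of this workflow can be sketched as follows. This is a minimal illustration, not the connector's actual code: the request-body field names follow the DBM v1.1 `queries` resource, but the report type, filter, and metric values are illustrative, and the HTTP call itself (a `POST` to the endpoint below with an OAuth2 bearer token) is omitted:

```python
from datetime import datetime, timezone

# DBM v1.1 base URL; a query is created with `POST {API_BASE}/query`.
API_BASE = "https://www.googleapis.com/doubleclickbidmanager/v1.1"


def to_ms(date_str: str) -> int:
    """Convert a YYYY-MM-DD date to epoch milliseconds (UTC midnight)."""
    dt = datetime.strptime(date_str, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)


def build_query_body(title, start_date, end_date, group_bys, metrics, filters=()):
    """Build an illustrative JSON body for the createquery request."""
    return {
        "metadata": {"title": title, "dataRange": "CUSTOM_DATES", "format": "CSV"},
        "params": {
            "type": "TYPE_GENERAL",
            "groupBys": list(group_bys),          # dimensions, e.g. FILTER_DATE
            "metrics": list(metrics),             # e.g. METRIC_IMPRESSIONS
            "filters": [{"type": t, "value": v} for t, v in filters],
        },
        # CUSTOM_DATES requires an explicit report window, in epoch ms.
        "reportDataStartTimeMs": to_ms(start_date),
        "reportDataEndTimeMs": to_ms(end_date),
    }
```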
38 changes: 38 additions & 0 deletions airbyte-integrations/connectors/source-dv-360/Dockerfile
@@ -0,0 +1,38 @@
FROM python:3.7.11-alpine3.14 as base

# build and load all requirements
FROM base as builder
WORKDIR /airbyte/integration_code

# upgrade pip to the latest version
RUN apk --no-cache upgrade \
&& pip install --upgrade pip \
&& apk --no-cache add tzdata build-base


COPY setup.py ./
# install necessary packages to a temporary folder
RUN pip install --prefix=/install .

# build a clean environment
FROM base
WORKDIR /airbyte/integration_code

# copy all loaded and built libraries to a pure basic image
COPY --from=builder /install /usr/local
# add default timezone settings
COPY --from=builder /usr/share/zoneinfo/Etc/UTC /etc/localtime
RUN echo "Etc/UTC" > /etc/timezone

# bash is installed for more convenient debugging.
RUN apk --no-cache add bash

# copy payload code only
COPY main.py ./
COPY source_dv_360 ./source_dv_360

ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]

LABEL io.airbyte.version=0.1.0
LABEL io.airbyte.name=airbyte/source-dv-360
129 changes: 129 additions & 0 deletions airbyte-integrations/connectors/source-dv-360/README.md
@@ -0,0 +1,129 @@
# Display & Video 360 Source

This is the repository for the Display & Video 360 source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/dv360).

## Local development

### Prerequisites
**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies
From this connector directory, create a virtual environment:
```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
```
source .venv/bin/activate
pip install -r requirements.txt
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything
should work as you expect.

#### Building via Gradle
From the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:source-dv-360:build
```

#### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/dv360)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_dv_360/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source dv360 test creds`
and place them into `secrets/config.json`.

### Locally running the connector
```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Locally running the connector docker image

#### Build
First, make sure you build the latest Docker image:
```
docker build . -t airbyte/source-dv-360:dev
```

You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-dv-360:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in
the Dockerfile.

#### Run
Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-dv-360:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dv-360:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dv-360:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-dv-360:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
## Testing
Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named.
First install test dependencies into your virtual environment:
```
pip install .[tests]
```
### Unit Tests
To run unit tests locally, from the connector directory run:
```
python -m pytest unit_tests
```

### Integration Tests
There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector).
#### Custom Integration tests
Place custom tests inside `integration_tests/` folder, then, from the connector root, run
```
python -m pytest integration_tests
```
#### Acceptance Tests
Customize `acceptance-test-config.yml` file to configure tests. See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
To run your integration tests with acceptance tests, from the connector root, run
```
python -m pytest integration_tests -p integration_tests.acceptance
```

### Using gradle to run tests
All commands should be run from airbyte project root.
To run unit tests:
```
./gradlew :airbyte-integrations:connectors:source-dv-360:unitTest
```
To run acceptance and custom integration tests:
```
./gradlew :airbyte-integrations:connectors:source-dv-360:integrationTest
```

## Dependency Management
All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
* dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
* dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing unit and integration tests.
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)).
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -0,0 +1,30 @@
# See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference)
# for more information about how to configure these tests
connector_image: airbyte/source-dv-360:dev
tests:
  spec:
    - spec_path: "source_dv_360/spec.json"
  connection:
    - config_path: "secrets/config.json"
      status: "succeed"
    - config_path: "integration_tests/invalid_config.json"
      status: "failed"
  discovery:
    - config_path: "secrets/config.json"
  basic_read:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      empty_streams: []
      # TODO uncomment this block to specify that the tests should assert the connector outputs the records provided in the input file
      # expect_records:
      #   path: "integration_tests/expected_records.txt"
      #   extra_fields: no
      #   exact_order: no
      #   extra_records: yes
  incremental: # TODO if your connector does not implement incremental sync, remove this block
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
      future_state_path: "integration_tests/abnormal_state.json"
  full_refresh:
    - config_path: "secrets/config.json"
      configured_catalog_path: "integration_tests/configured_catalog.json"
@@ -0,0 +1,16 @@
#!/usr/bin/env sh

# Build latest connector image
docker build . -t $(cat acceptance-test-config.yml | grep "connector_image" | head -n 1 | cut -d: -f2)

# Pull latest acctest image
docker pull airbyte/source-acceptance-test:latest

# Run
docker run --rm -it \
-v /var/run/docker.sock:/var/run/docker.sock \
-v /tmp:/tmp \
-v $(pwd):/test_input \
airbyte/source-acceptance-test \
--acceptance-test-config /test_input

9 changes: 9 additions & 0 deletions airbyte-integrations/connectors/source-dv-360/build.gradle
@@ -0,0 +1,9 @@
plugins {
    id 'airbyte-python'
    id 'airbyte-docker'
    id 'airbyte-source-acceptance-test'
}

airbytePython {
    moduleDirectory 'source_dv_360'
}
@@ -0,0 +1,3 @@
#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,5 @@
{
  "standard": {
    "date": "2224-01-01"
  }
}
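This abnormal-state file puts the `standard` stream's `date` cursor in the far future (`2224-01-01`), so the incremental acceptance test can verify that no records are emitted past the saved cursor. A minimal sketch of the cursor update such an incremental stream presumably performs (function and field names are assumptions, not the connector's actual implementation):

```python
# Hypothetical sketch of incremental cursor handling; not the
# connector's real code.
def get_updated_state(current_state: dict, latest_record: dict) -> dict:
    """Keep the max of the saved cursor and the latest record's date.

    Dates are ISO YYYY-MM-DD strings, so lexicographic comparison
    matches chronological order.
    """
    saved = current_state.get("date", "0001-01-01")
    latest = latest_record.get("date", saved)
    return {"date": max(saved, latest)}
```

With the abnormal state above, every real record's date is older than `2224-01-01`, so the cursor never moves backwards and no new records should be read.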
@@ -0,0 +1,16 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import pytest

pytest_plugins = ("source_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """This fixture is a placeholder for external resources that acceptance test might require."""
    # TODO: setup test dependencies
    yield
    # TODO: clean up test dependencies
