
🎉 New Source: DataScope [low-code cdk] (#18725)
* adding datascope connector

* fixing tests

* fixing readme

* cleanup

* adding incremental streams

* correct tests

* fixing non incremental streams

* dropping incremental for some streams

* update date cursor for incremental

* rename parameters

* correct integration test files

* remove files

* update start date example

* add datascope to source def

* add eof

* auto-bump connector version

Co-authored-by: marcosmarxm <marcosmarxm@gmail.com>
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
3 people authored and akashkulk committed Nov 17, 2022
1 parent b891f17 commit 8a92b03
Showing 29 changed files with 718 additions and 0 deletions.
Original file line number Diff line number Diff line change
@@ -314,6 +314,13 @@
documentationUrl: https://docs.airbyte.com/integrations/sources/datadog
sourceType: api
releaseStage: alpha
- name: Datascope
sourceDefinitionId: 8e1ae2d2-4790-44d3-9d83-75b3fc3940ff
dockerRepository: airbyte/source-datascope
dockerImageTag: 0.1.0
documentationUrl: https://docs.airbyte.com/integrations/sources/datascope
sourceType: api
releaseStage: alpha
- name: Delighted
sourceDefinitionId: cc88c43f-6f53-4e8a-8c4d-b284baaf9635
dockerRepository: airbyte/source-delighted
27 changes: 27 additions & 0 deletions airbyte-config/init/src/main/resources/seed/source_specs.yaml
@@ -2697,6 +2697,33 @@
supportsNormalization: false
supportsDBT: false
supported_destination_sync_modes: []
- dockerImage: "airbyte/source-datascope:0.1.0"
spec:
documentationUrl: "https://docs.airbyte.com/integrations/sources/datascope"
connectionSpecification:
$schema: "http://json-schema.org/draft-07/schema#"
title: "Datascope Spec"
type: "object"
required:
- "api_key"
- "start_date"
additionalProperties: true
properties:
start_date:
title: "Start Date"
type: "string"
description: "Start date for the data to be replicated"
examples:
- "dd/mm/YYYY HH:MM"
pattern: "^[0-9]{2}/[0-9]{2}/[0-9]{4} [0-9]{2}:[0-9]{2}$"
api_key:
title: "Authorization"
type: "string"
description: "API Key"
airbyte_secret: true
supportsNormalization: false
supportsDBT: false
supported_destination_sync_modes: []
- dockerImage: "airbyte/source-delighted:0.1.4"
spec:
documentationUrl: "https://docsurl.com"
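The `start_date` format declared in the DataScope spec above can be sanity-checked locally. A minimal sketch using the regex copied verbatim from the spec (the helper function name is hypothetical):

```python
import re

# Pattern copied from the connector spec: dd/mm/YYYY HH:MM
START_DATE_PATTERN = re.compile(r"^[0-9]{2}/[0-9]{2}/[0-9]{4} [0-9]{2}:[0-9]{2}$")

def is_valid_start_date(value: str) -> bool:
    """Return True if the value matches the spec's start_date format."""
    return bool(START_DATE_PATTERN.match(value))

print(is_valid_start_date("01/01/2000 00:00"))      # True: matches dd/mm/YYYY HH:MM
print(is_valid_start_date("2019-01-01T00:00:00Z"))  # False: ISO-8601 does not match
```

Note the regex only checks shape, not calendar validity; `99/99/2022 00:00` would also pass.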
@@ -0,0 +1,6 @@
*
!Dockerfile
!main.py
!source_datascope
!setup.py
!secrets
10 changes: 10 additions & 0 deletions airbyte-integrations/connectors/source-datascope/BOOTSTRAP.md
@@ -0,0 +1,10 @@
# DataScope
DataScope is a mobile solution that helps you collect data offline, manage field teams, and share business insights. Use the intuitive Form Builder to create your forms, and then analyze the data you've collected via powerful and personalized dashboards.

The implemented streams allow you to pull data from the following DataScope objects:
- Locations
- Answers
- Lists
- Notifications

For more information about the DataScope API, see the [DataScope API documentation](https://dscope.github.io/docs/).
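As an illustration of how these objects map onto HTTP requests, here is a hedged sketch; the base URL, path layout, and `Authorization` header name are assumptions for illustration only, not taken from the DataScope docs:

```python
BASE_URL = "https://www.mydatascope.com/api/external"  # hypothetical base URL

def build_request(stream: str, api_key: str):
    """Return a (url, headers) pair for one stream request.

    Both the path layout and the Authorization header name are
    assumptions; consult the DataScope API docs for the real contract.
    """
    return f"{BASE_URL}/{stream}", {"Authorization": api_key}

# One request per object the connector exposes as a stream
for stream in ["locations", "answers", "lists", "notifications"]:
    url, headers = build_request(stream, "YOUR_API_KEY")
    print(url)
```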
38 changes: 38 additions & 0 deletions airbyte-integrations/connectors/source-datascope/Dockerfile
@@ -0,0 +1,38 @@
FROM python:3.9.11-alpine3.15 as base

# build and load all requirements
FROM base as builder
WORKDIR /airbyte/integration_code

# upgrade pip to the latest version
RUN apk --no-cache upgrade \
&& pip install --upgrade pip \
&& apk --no-cache add tzdata build-base


COPY setup.py ./
# install necessary packages to a temporary folder
RUN pip install --prefix=/install .

# build a clean environment
FROM base
WORKDIR /airbyte/integration_code

# copy all loaded and built libraries to a pure basic image
COPY --from=builder /install /usr/local
# add default timezone settings
COPY --from=builder /usr/share/zoneinfo/Etc/UTC /etc/localtime
RUN echo "Etc/UTC" > /etc/timezone

# bash is installed for more convenient debugging.
RUN apk --no-cache add bash

# copy payload code only
COPY main.py ./
COPY source_datascope ./source_datascope

ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]

LABEL io.airbyte.version=0.1.0
LABEL io.airbyte.name=airbyte/source-datascope
79 changes: 79 additions & 0 deletions airbyte-integrations/connectors/source-datascope/README.md
@@ -0,0 +1,79 @@
# Datascope Source

This is the repository for the Datascope configuration based source connector.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/datascope).

## Local development

#### Building via Gradle
You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow.

To build using Gradle, from the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:source-datascope:build
```

#### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/datascope)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_datascope/spec.yaml` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source datascope test creds`
and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build
First, make sure you build the latest Docker image:
```
docker build . -t airbyte/source-datascope:dev
```

You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-datascope:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in
the Dockerfile.

#### Run
Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-datascope:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datascope:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datascope:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-datascope:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
## Testing

#### Acceptance Tests
Customize `acceptance-test-config.yml` file to configure tests. See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

To run your integration tests with Docker, build the connector image and run the `airbyte/source-acceptance-test` image against it.

### Using gradle to run tests
All commands should be run from airbyte project root.
To run unit tests:
```
./gradlew :airbyte-integrations:connectors:source-datascope:unitTest
```
To run acceptance and custom integration tests:
```
./gradlew :airbyte-integrations:connectors:source-datascope:integrationTest
```

## Dependency Management
All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
* dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
* dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing unit and integration tests.
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)).
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
3 changes: 3 additions & 0 deletions airbyte-integrations/connectors/source-datascope/__init__.py
@@ -0,0 +1,3 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,24 @@
# See [Source Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/source-acceptance-tests-reference)
# for more information about how to configure these tests
connector_image: airbyte/source-datascope:dev
tests:
spec:
- spec_path: "source_datascope/spec.yaml"
connection:
- config_path: "secrets/config.json"
status: "succeed"
- config_path: "integration_tests/invalid_config.json"
status: "failed"
discovery:
- config_path: "secrets/config.json"
basic_read:
- config_path: "secrets/config.json"
configured_catalog_path: "integration_tests/configured_catalog.json"
empty_streams: ["notifications", "lists"]
incremental:
- config_path: "secrets/config.json"
configured_catalog_path: "integration_tests/configured_catalog.json"
future_state_path: "integration_tests/abnormal_state.json"
full_refresh:
- config_path: "secrets/config.json"
configured_catalog_path: "integration_tests/configured_catalog.json"
@@ -0,0 +1,16 @@
#!/usr/bin/env sh

# Build latest connector image
docker build . -t $(grep "connector_image" acceptance-test-config.yml | head -n 1 | cut -d: -f2-)

# Pull latest acctest image
docker pull airbyte/source-acceptance-test:latest

# Run
docker run --rm -it \
-v /var/run/docker.sock:/var/run/docker.sock \
-v /tmp:/tmp \
-v $(pwd):/test_input \
airbyte/source-acceptance-test \
--acceptance-test-config /test_input

9 changes: 9 additions & 0 deletions airbyte-integrations/connectors/source-datascope/build.gradle
@@ -0,0 +1,9 @@
plugins {
id 'airbyte-python'
id 'airbyte-docker'
id 'airbyte-source-acceptance-test'
}

airbytePython {
moduleDirectory 'source_datascope'
}
@@ -0,0 +1,3 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#
@@ -0,0 +1,5 @@
{
"answers": {
"created_at": "01/01/9999 00:00"
}
}
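The far-future `created_at` cursor above exists so the incremental acceptance test can verify that no records are emitted when state is already ahead of all data. A minimal sketch of the comparison, assuming the connector parses cursors with the spec's `dd/mm/YYYY HH:MM` format (the comparison helper itself is an assumption):

```python
from datetime import datetime

CURSOR_FORMAT = "%d/%m/%Y %H:%M"  # matches the spec's dd/mm/YYYY HH:MM

def is_newer(record_cursor: str, state_cursor: str) -> bool:
    """True if the record's cursor is strictly after the saved state cursor."""
    return datetime.strptime(record_cursor, CURSOR_FORMAT) > datetime.strptime(state_cursor, CURSOR_FORMAT)

# Against the abnormal year-9999 state, no realistic record is newer:
print(is_newer("30/10/2022 00:00", "01/01/9999 00:00"))  # False
```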
@@ -0,0 +1,14 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import pytest

pytest_plugins = ("source_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
"""This fixture is a placeholder for external resources that acceptance test might require."""
yield
@@ -0,0 +1,43 @@
{
"streams": [
{
"stream": {
"name": "locations",
"json_schema": {},
"source_defined_cursor": true,
"source_defined_primary_key": [["id"]],
"supported_sync_modes": ["full_refresh"]
},
"sync_mode": "full_refresh",
"destination_sync_mode": "append"
},
    {
      "stream": {
        "name": "answers",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh", "incremental"]
      },
      "sync_mode": "incremental",
      "destination_sync_mode": "overwrite"
    },
{
"stream": {
"name": "lists",
"json_schema": {},
"supported_sync_modes": ["full_refresh"]
},
"sync_mode": "full_refresh",
"destination_sync_mode": "overwrite"
},
{
"stream": {
"name": "notifications",
"json_schema": {},
"supported_sync_modes": ["full_refresh"]
},
"sync_mode": "full_refresh",
"destination_sync_mode": "overwrite"
}
]
}
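A configured catalog entry's `sync_mode` must be one of the stream's `supported_sync_modes`. A quick self-contained check on a reduced copy of the catalog above (plain JSON handling, not the CDK's models):

```python
import json

catalog_json = """
{
  "streams": [
    {"stream": {"name": "answers", "supported_sync_modes": ["full_refresh", "incremental"]},
     "sync_mode": "incremental", "destination_sync_mode": "overwrite"},
    {"stream": {"name": "lists", "supported_sync_modes": ["full_refresh"]},
     "sync_mode": "full_refresh", "destination_sync_mode": "overwrite"}
  ]
}
"""
catalog = json.loads(catalog_json)

# Each configured sync_mode must be supported by its stream.
for entry in catalog["streams"]:
    assert entry["sync_mode"] in entry["stream"]["supported_sync_modes"]

incremental = [e["stream"]["name"] for e in catalog["streams"] if e["sync_mode"] == "incremental"]
print(incremental)  # ['answers']
```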
@@ -0,0 +1,4 @@
{
"api_key": "abctestconfig",
"start_date": "2019-01-01T00:00:00Z"
}
@@ -0,0 +1,4 @@
{
"api_key": "test_key",
"start_date": "2022-10-30 00:00"
}
@@ -0,0 +1,5 @@
{
"answers": {
"created_at": "01/01/2000 00:00"
}
}
13 changes: 13 additions & 0 deletions airbyte-integrations/connectors/source-datascope/main.py
@@ -0,0 +1,13 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


import sys

from airbyte_cdk.entrypoint import launch
from source_datascope import SourceDatascope

if __name__ == "__main__":
source = SourceDatascope()
launch(source, sys.argv[1:])
@@ -0,0 +1,2 @@
-e ../../bases/source-acceptance-test
-e .
29 changes: 29 additions & 0 deletions airbyte-integrations/connectors/source-datascope/setup.py
@@ -0,0 +1,29 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
"airbyte-cdk~=0.1",
]

TEST_REQUIREMENTS = [
"pytest~=6.1",
"pytest-mock~=3.6.1",
"source-acceptance-test",
]

setup(
name="source_datascope",
description="Source implementation for Datascope.",
author="Airbyte",
author_email="contact@airbyte.io",
packages=find_packages(),
install_requires=MAIN_REQUIREMENTS,
package_data={"": ["*.json", "*.yaml", "schemas/*.json", "schemas/shared/*.json"]},
extras_require={
"tests": TEST_REQUIREMENTS,
},
)
@@ -0,0 +1,8 @@
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#


from .source import SourceDatascope

__all__ = ["SourceDatascope"]