Added e2e tests with snapshot and cdc - Draft PR #103

Open · wants to merge 14 commits into develop
.github/workflows/e2e.yml: 123 additions & 0 deletions
@@ -0,0 +1,123 @@
# Copyright © 2023 Cask Data, Inc.
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
# Note: Any changes to this workflow take effect only after they are merged into develop
name: Build e2e tests

on:
  push:
    branches: [ e2e ]
  pull_request:
    branches: [ e2e ]
    types: [ opened, synchronize, reopened, labeled ]
  workflow_dispatch:

jobs:
  build:
    runs-on: k8s-runner-e2e
    # We allow builds:
    #  1) When triggered manually
    #  2) When it's a merge into a branch
    #  3) For PRs that are labeled as build, and either
    #     - it's a code change, or
    #     - a build label was just added
    # A bit complex, but prevents builds when other labels are manipulated
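    # (For example, adding an unrelated label to a PR that already carries the
    # 'build' label fires a 'labeled' event for that other label, which the
    # last clause below filters out.)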
    if: >
      github.event_name == 'workflow_dispatch'
      || github.event_name == 'push'
      || (contains(github.event.pull_request.labels.*.name, 'build')
          && (github.event.action != 'labeled' || github.event.label.name == 'build')
      )
    strategy:
      fail-fast: false

    steps:
      # Check out the plugin source, including submodules
      - uses: actions/checkout@v3
        with:
          path: plugin
          submodules: 'recursive'
          ref: ${{ github.event.workflow_run.head_sha }}
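          # Note: github.event.workflow_run is only populated for workflow_run
          # events; for push/pull_request/workflow_dispatch this evaluates to an
          # empty string, so checkout falls back to the event's default ref.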

      - uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push'
        id: filter
        with:
          working-directory: plugin
          filters: |
            e2e-test:
              - '**/e2e-test/**'

      - name: Checkout e2e test repo
        uses: actions/checkout@v3
        with:
          repository: cdapio/cdap-e2e-tests
          path: e2e

      - name: Cache
        uses: actions/cache@v3
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-${{ github.workflow }}

      - name: Get Secrets from GCP Secret Manager
        id: secrets
        uses: 'google-github-actions/get-secretmanager-secrets@v0'
        with:
          secrets: |-
            ORACLE_HOST:cdapio-github-builds/ORACLE_HOST
            ORACLE_USERNAME:cdapio-github-builds/ORACLE_NORMAL_USERNAME
            ORACLE_PASSWORD:cdapio-github-builds/ORACLE_NORMAL_PASSWORD
            ORACLE_PORT:cdapio-github-builds/ORACLE_PORT
            PROJECT_ID:cdapio-github-builds/PROJECT_ID_FOR_REPL

      - name: Run required e2e tests
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push' && steps.filter.outputs.e2e-test == 'false'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --testRunner TestRunnerRequired.java
        env:
          ORACLE_HOST: ${{ steps.secrets.outputs.ORACLE_HOST }}
          ORACLE_USERNAME: ${{ steps.secrets.outputs.ORACLE_USERNAME }}
          ORACLE_PASSWORD: ${{ steps.secrets.outputs.ORACLE_PASSWORD }}
          ORACLE_PORT: ${{ steps.secrets.outputs.ORACLE_PORT }}

      - name: Run all e2e tests
        if: github.event_name == 'workflow_dispatch' || github.event_name == 'push' || steps.filter.outputs.e2e-test == 'true'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --testRunner TestRunner.java
        env:
          ORACLE_HOST: ${{ steps.secrets.outputs.ORACLE_HOST }}
          ORACLE_USERNAME: ${{ steps.secrets.outputs.ORACLE_USERNAME }}
          ORACLE_PASSWORD: ${{ steps.secrets.outputs.ORACLE_PASSWORD }}
          ORACLE_PORT: ${{ steps.secrets.outputs.ORACLE_PORT }}
          PROJECT_ID: ${{ steps.secrets.outputs.PROJECT_ID }}

      - name: Upload report
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Cucumber report
          path: ./**/target/cucumber-reports

      - name: Upload debug files
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Debug files
          path: ./**/target/e2e-debug

      - name: Upload files to GCS
        uses: google-github-actions/upload-cloud-storage@v0
        if: always()
        with:
          path: ./plugin
          destination: e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}
          glob: '**/target/cucumber-reports/**'
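Note: google-github-actions/get-secretmanager-secrets exposes each fetched secret as a step output (steps.secrets.outputs.<NAME>), which is how the two run steps populate their env blocks. The paths filter means PRs that touch no e2e-test sources run only the lighter TestRunnerRequired suite, while manual dispatches, pushes, and e2e changes run the full TestRunner suite.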
pom.xml: 150 additions & 1 deletion
@@ -153,7 +153,7 @@
     <dependency>
       <groupId>org.junit.jupiter</groupId>
       <artifactId>junit-jupiter-engine</artifactId>
-      <version>5.4.0</version>
+      <version>5.9.1</version>
       <scope>test</scope>
     </dependency>
     <dependency>
@@ -385,6 +385,155 @@
        </plugins>
      </build>
    </profile>
    <profile>
      <id>e2e-tests</id>
      <properties>
        <testSourceLocation>src/e2e-test/java</testSourceLocation>
        <TEST_RUNNER>TestRunner.java</TEST_RUNNER>
      </properties>
      <build>
        <testResources>
          <testResource>
            <directory>src/e2e-test/resources</directory>
          </testResource>
        </testResources>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.22.0</version>
            <configuration>
              <skipTests>true</skipTests>
            </configuration>
          </plugin>

          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <version>3.0.0</version>
            <dependencies>
              <dependency>
                <groupId>org.apache.maven.surefire</groupId>
                <artifactId>surefire-junit47</artifactId>
                <version>3.0.0</version>
              </dependency>
            </dependencies>
            <configuration>
              <includes>
                <include>${TEST_RUNNER}</include>
              </includes>
              <!-- Start configuration to run TestRunners in parallel -->
              <parallel>classes</parallel> <!-- Run TestRunner classes in parallel -->
              <threadCount>2</threadCount> <!-- Number of classes to run in parallel -->
              <forkCount>2</forkCount> <!-- Number of JVM processes -->
              <reuseForks>true</reuseForks>
              <!-- End configuration to run TestRunners in parallel -->
              <environmentVariables>
                <GOOGLE_APPLICATION_CREDENTIALS>
                  ${GOOGLE_APPLICATION_CREDENTIALS}
                </GOOGLE_APPLICATION_CREDENTIALS>
                <SERVICE_ACCOUNT_TYPE>
                  ${SERVICE_ACCOUNT_TYPE}
                </SERVICE_ACCOUNT_TYPE>
                <SERVICE_ACCOUNT_FILE_PATH>
                  ${SERVICE_ACCOUNT_FILE_PATH}
                </SERVICE_ACCOUNT_FILE_PATH>
                <SERVICE_ACCOUNT_JSON>
                  ${SERVICE_ACCOUNT_JSON}
                </SERVICE_ACCOUNT_JSON>
              </environmentVariables>
            </configuration>
            <executions>
              <execution>
                <goals>
                  <goal>integration-test</goal>
                </goals>
              </execution>
            </executions>
          </plugin>

          <plugin>
            <groupId>net.masterthought</groupId>
            <artifactId>maven-cucumber-reporting</artifactId>
            <version>5.5.0</version>
            <executions>
              <execution>
                <id>execution</id>
                <phase>verify</phase>
                <goals>
                  <goal>generate</goal>
                </goals>
                <configuration>
                  <projectName>Cucumber Reports</projectName> <!-- Replace with project name -->
                  <outputDirectory>target/cucumber-reports/advanced-reports</outputDirectory>
                  <buildNumber>1</buildNumber>
                  <skip>false</skip>
                  <inputDirectory>${project.build.directory}/cucumber-reports</inputDirectory>
                  <jsonFiles> <!-- supports wildcard or name pattern -->
                    <param>**/*.json</param>
                  </jsonFiles> <!-- optional, defaults to outputDirectory if not specified -->
                  <classificationDirectory>${project.build.directory}/cucumber-reports</classificationDirectory>
                  <checkBuildResult>true</checkBuildResult>
                </configuration>
              </execution>
            </executions>
          </plugin>
          <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>3.3.0</version>
            <executions>
              <execution>
                <id>add-test-source</id>
                <phase>generate-test-sources</phase>
                <goals>
                  <goal>add-test-source</goal>
                </goals>
                <configuration>
                  <sources>
                    <source>${project.basedir}/${testSourceLocation}</source>
                  </sources>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>

      <dependencyManagement>
        <dependencies>
          <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>31.1-jre</version>
          </dependency>
        </dependencies>
      </dependencyManagement>
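      <!-- Pinning guava under dependencyManagement presumably forces a single
           version across the plugin and e2e-framework dependency trees, a common
           way to head off version-conflict errors at test runtime. -->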

      <dependencies>
        <dependency>
          <groupId>com.oracle.database.jdbc</groupId>
          <artifactId>ojdbc8</artifactId>
          <version>21.1.0.0</version>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>io.cdap.tests.e2e</groupId>
          <artifactId>cdap-e2e-framework</artifactId>
          <version>0.3.0-SNAPSHOT</version>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>ch.qos.logback</groupId>
          <artifactId>logback-classic</artifactId>
          <version>1.2.8</version>
          <scope>runtime</scope>
        </dependency>
      </dependencies>

    </profile>

  </profiles>

</project>
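Taken together, the e2e-tests profile registers ${testSourceLocation} as an extra test-source root via build-helper, skips surefire, and has failsafe run only the class named by ${TEST_RUNNER}, with the Cucumber HTML report generated during verify. A rough local invocation, assuming the framework's environment variables (the SERVICE_ACCOUNT_* values above, plus the Oracle settings from the workflow) are exported, might be:

    mvn clean verify -P e2e-tests -DTEST_RUNNER=TestRunnerRequired.java

Omitting -DTEST_RUNNER falls back to the TestRunner.java default declared in the profile's properties.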
src/e2e-test/features/Pipeline.feature: 51 additions & 0 deletions
@@ -0,0 +1,51 @@
#
Review comment (Contributor): I think it would be better to rename the file to SanityTests.feature to reflect the intent.

# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@Oracle
Feature: Oracle - Verify Oracle source data transfer to BigQuery

  @ENV_VARIABLES
  Scenario: Verify replication of snapshot and CDC data from Oracle to BigQuery with a sanity test
    Given Open DataFusion Project with replication to configure pipeline
    When Enter input plugin property: "name" with value: "pipelineName"
    And Click on the Next button
    And Select Oracle as Source
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Click plugin property: "region"
    Then Click plugin property: "regionOption"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Replace input plugin property: "sid" with value: "dataset" for Credentials and Authorization related fields
    Then Click on the Next button
    Then Replace input plugin property: "loadInterval" with value: "loadInterval"
    Then Click on the Next button
    Then Validate Source table is available and select it
    And Click on the Next button
    And Click on the Next button
    And Click on the Next button
    Then Deploy the replication pipeline
    And Run the replication Pipeline
    Then Open the logs
    And Wait till pipeline is in running state and check if no errors occurred
    Then Verify expected Oracle records in target BigQuery table
    And Insert a record in the source table and wait for replication
Review comment (Contributor): Can we separate out the "wait for replication" part, so that we can compose a scenario where multiple operations are performed in one go and then we wait once?

Reply (Contributor, author): Should we then separate the wait for every operation, i.e. insert, update, and delete, or shall we combine the operations?

Reply (Contributor): We can combine them too, as that achieves the same thing.
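For illustration, the combined shape being discussed could look something like this sketch (the batched wait step is hypothetical and would need a matching step definition in the framework):

    And Insert a record in the source table
    And Update a record in the source table
    And Delete a record in the source table
    # Hypothetical step: a single wait covering all three operations above
    And Wait for replication of pending events
    Then Verify expected Oracle records in target BigQuery table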

    Then Verify expected Oracle records in target BigQuery table
    And Delete a record in the source table and wait for replication
    Then Verify expected Oracle records in target BigQuery table
    And Update a record in the source table and wait for replication
    Then Verify expected Oracle records in target BigQuery table
    And Capture raw logs
    Then Close the pipeline logs and stop the pipeline