[TEST] [DELTA] Refine CI and add test delta 1.0.0 with Spark 3.1 #643
Conversation
<groupId>*</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
Would it be better to add a comment or a TODO at this place as a reminder to update it in the future?
updated
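For concreteness, the suggested TODO might look like the following in the pom. Only the wildcard exclusion itself comes from the diff above; the surrounding dependency coordinates are assumptions for illustration.

```xml
<!-- Hypothetical surrounding dependency; only the wildcard exclusion
     and the suggested TODO reflect the diff under discussion. -->
<dependency>
  <groupId>io.delta</groupId>
  <artifactId>delta-core_${scala.binary.version}</artifactId>
  <version>${delta.version}</version>
  <exclusions>
    <!-- TODO: revisit this wildcard exclusion on future Delta upgrades;
         a narrower exclusion list may become possible. -->
    <exclusion>
      <groupId>*</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```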
Codecov Report
@@ Coverage Diff @@
## master #643 +/- ##
=======================================
Coverage 79.87% 79.87%
=======================================
Files 120 120
Lines 4671 4671
Branches 562 562
=======================================
Hits 3731 3731
Misses 624 624
Partials 316 316
Continue to review the full report at Codecov.
Travis failed with
Thank you @pan3793 for driving this.
.github/workflows/master.yml
Outdated
- '-DwildcardSuites=org.apache.kyuubi.operation.tpcds.TPCDSOutputSchemaSuite,org.apache.kyuubi.operation.tpcds.TPCDSDDLSuite -Dmaven.plugin.scalatest.exclude.tags=""'
- '-Pspark-3.0 -Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-3.1.1 -Dspark.archive.name=spark-3.1.1-bin-hadoop2.7.tgz -Dmaven.plugin.scalatest.exclude.tags=org.apache.kyuubi.tags.ExtendedSQLTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest'
- '-Pspark-3.1 -Pkyuubi-extension-spark_3.1'
- '-Pspark-3.1 -Pkyuubi-extension-spark_3.1 -Pspark-hadoop-3.2'
Ah, it seems we need to backport this into branch-1.2.0.
yes
@@ -27,4 +27,4 @@
 @TagAnnotation
 @Retention(RetentionPolicy.RUNTIME)
 @Target({ElementType.METHOD, ElementType.TYPE})
-public @interface DataLakeTest {}
+public @interface DeltaTest {}
Not a big issue, but shall we add a parent tag for DeltaTest and IcebergTest?
I think we use tags here to mark tests that don't support specific Spark versions, and due to limitations of the scalatest maven plugin (such as the lack of collection operations on tags), the workflow scripts are not elegant right now. Could you explain the benefit of adding a parent tag?
Not a strong opinion. I'm fine with both.
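To make the tag mechanics concrete, here is a minimal, hedged sketch of how a tag annotation like DeltaTest is declared and detected. The real annotation also carries scalatest's @TagAnnotation (omitted here to avoid the dependency), and DeltaSuite/TagDemo are hypothetical names.

```java
import java.lang.annotation.*;

// Sketch of the DeltaTest tag annotation; the real one in the PR also has
// @TagAnnotation from scalatest, omitted here so this is self-contained.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
@interface DeltaTest {}

// Hypothetical suite marked with the tag, as a test class would be.
@DeltaTest
class DeltaSuite {}

public class TagDemo {
    public static void main(String[] args) {
        // scalatest-maven-plugin's exclude.tags matches the annotation's
        // fully qualified name; at runtime the tag resolves via reflection:
        boolean tagged = DeltaSuite.class.isAnnotationPresent(DeltaTest.class);
        System.out.println(tagged); // prints "true"
    }
}
```

One design note on the parent-tag question: since `isAnnotationPresent` does not traverse meta-annotations, a parent tag would likely still need to be listed explicitly alongside each child in the exclude lists, which may be why it adds little here.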
Mistouched the close button :(
.github/workflows/master.yml
Outdated
 - name: Code coverage
   if: ${{ matrix.profiles == '' }}
   run: bash <(curl -s https://codecov.io/bash)
 - name: Detected Dependency List Change
-  if: ${{ ! contains(matrix.profiles, 'spark-3.2-snapshot') && ! contains(matrix.profiles, 'wildcardSuites') }}
+  if: ${{ ! contains(matrix.profiles, 'spark-3.2-snapshot') && ! contains(matrix.profiles, 'tpcds.tests=true') }}
Instead of using the matrix, I'd prefer to move the TPC-DS tests to another, separate job, like rat.
And, not related to this PR: can we move spark-3.2-snapshot to another yml with nightly build enabled?
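The standalone-job suggestion (similar to rat) might look roughly like the sketch below. Job and step names, action versions, and the JDK setup are assumptions; the mvn commands are taken from the .travis.yml snippet later in the thread.

```yaml
# Hypothetical sketch: TPC-DS split out of the build matrix into its own job.
tpcds:
  name: TPC-DS Tests
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v2
    - name: Setup JDK 8
      uses: actions/setup-java@v1
      with:
        java-version: '8'
    - name: Run TPC-DS Tests
      run: |
        mvn clean install -DskipTests -pl :kyuubi-spark-sql-engine,:kyuubi-common,:kyuubi-ha,:kyuubi-zookeeper
        mvn test -Dtest=none -DwildcardSuites=org.apache.kyuubi.operation.tpcds.TPCDSOutputSchemaSuite,org.apache.kyuubi.operation.tpcds.TPCDSDDLSuite
```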
pom.xml
Outdated
 </properties>
 </profile>

 <profile>
-  <id>spark-3.1</id>
+  <id>spark-3.2-snapshot</id>
spark-master?
sounds good
.travis.yml
Outdated
- |
  if [[ "$JOB" == "tpcds-tests" ]]; then
    mvn clean install $PROFILE -DskipTests -pl :kyuubi-spark-sql-engine,:kyuubi-common,:kyuubi-ha,:kyuubi-zookeeper
    mvn test $PROFILE -Dtest=none -DwildcardSuites=org.apache.kyuubi.operation.tpcds.TPCDSOutputSchemaSuite,org.apache.kyuubi.operation.tpcds.TPCDSDDLSuite
Can we rename org.apache.kyuubi.operation.tpcds.TPCDSOutputSchemaSuite and org.apache.kyuubi.operation.tpcds.TPCDSDDLSuite to xxxTPCDSSuite, then use -DwildcardSuites=*TPCDSSuite for short?
.github/workflows/master.yml
Outdated
@@ -91,3 +85,43 @@ jobs:
     path: |
       **/target/unit-tests.log
       **/kyuubi-spark-sql-engine.log*

 tpcds:
   name: TPC-DS Schema Tests
Not only schema? Simply "TPC-DS Tests"?
.github/workflows/master.yml
Outdated
key: ${{ runner.os }}-maven-io-${{ hashFiles('**/pom.xml') }}
restore-keys: |
  ${{ runner.os }}-maven-io-
- name: Run TPC-DS Schema Tests
ditto
.travis.yml
Outdated
- EXCLUDE_TAGS="org.apache.kyuubi.tags.ExtendedSQLTest,org.apache.kyuubi.tags.DataLakeTest"
- name: Test Kyuubi w/ -Pspark-3.1 -Pspark-hadoop-3.2 -Pkyuubi-extension-spark_3.1
- JOB="build"
- name: Test Kyuubi w/ -Pspark-master w/ Spark 3.2 nightly build
Remove "w/ Spark 3.2 nightly build"?
Thanks for the review. Got it merged for v1.3.0, and will send follow-up PRs for: 1. nightly build; 2. TPC-DS tests against Delta.
Why are the changes needed?
-Pkyuubi-sql-spark_3.1, introduced by [KYUUBI] Add kyuubi-spark-extensions module #631, in GH action.

How was this patch tested?
- Add some test cases that check the changes thoroughly, including negative and positive cases if possible
- Add screenshots for manual tests if appropriate
- Run tests locally before making a pull request