
[BEAM-7899] Fix Python Dataflow VR tests by specify sdk_location #9269

Merged
merged 1 commit into from Aug 6, 2019

Conversation

markflyhigh
Contributor

We forgot to explicitly set sdk_location for the Python 2 Dataflow VR tests after moving them to test-suites/dataflow/py2. The default value, build/apache-beam.tar.gz, is still in use, which is incorrect.

The following change should fix the failure.

+R: @udim
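
For context, a sketch of what the fixed task configuration looks like, assuming the task layout shown in the diff under review (task and variable names are taken from this PR's diff; the file path and surrounding body are abbreviated assumptions, not the exact Beam source):

```groovy
// Sketch of the fixed postCommitIT task in the py2 test-suite build.gradle
// (abbreviated; names taken from this PR's diff).
task postCommitIT(dependsOn: ['installGcpTest', 'sdist']) {
  doLast {
    def testOpts = basicTestOpts + ["--attr=IT"]
    def cmdArgs = project.mapToArgString(["test_opts": testOpts,
        "worker_jar": dataflowWorkerJar,
        // Explicitly point at the tarball produced by this project's sdist
        // task, instead of relying on the stale default build/apache-beam.tar.gz.
        "sdk_location": "${project.buildDir}/apache-beam.tar.gz"])
    // ... invoke the integration test runner with cmdArgs ...
  }
}
```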


Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

  • Choose reviewer(s) and mention them in a comment (R: @username).
  • Format the pull request title like [BEAM-XXX] Fixes bug in ApproximateQuantiles, where you replace BEAM-XXX with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

Post-Commit Tests Status (on master branch)

Lang | SDK | Apex | Dataflow | Flink | Gearpump | Samza | Spark
[Build status badges for the Go, Java, and Python rows omitted; per the original table, Go had jobs only for SDK, Flink, and Spark, and Python had no Apex, Gearpump, or Samza jobs.]

Pre-Commit Tests Status (on master branch)

--- | Java | Python | Go | Website
[Build status badges for the Non-portable and Portable rows omitted; per the original table, Portable had a Python job only.]

See .test-infra/jenkins/README for trigger phrase, status and link of all Jenkins jobs.

@markflyhigh markflyhigh requested a review from udim August 5, 2019 22:37
@markflyhigh
Contributor Author

Run Python Dataflow ValidatesRunner

Member

@udim udim left a comment


beam_PostCommit_Py_VR_Dataflow_PR is now failing on ImportError: No module named 'beam_runner_api_pb2', but that's ok.
Waiting for python 2 postcommit to pass.

@@ -70,6 +70,7 @@ task postCommitIT(dependsOn: ['installGcpTest', 'sdist']) {
     def testOpts = basicTestOpts + ["--attr=IT"]
     def cmdArgs = project.mapToArgString(["test_opts": testOpts,
         "worker_jar": dataflowWorkerJar,
+        "sdk_location": "${project.buildDir}/apache-beam.tar.gz",
Member


This affects Python 2 postcommits, but that job was not broken.
How did it work?

Contributor Author


The default path build/apache-beam.tar.gz is used, which is the output of :sdks:python:sdist and happens to be a dependency task of PostCommit_Py2. If you look at the timeline of this job, :sdks:python:sdist executes before postCommitIT, so by the time postCommitIT runs, the tarball is ready.
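
The ordering described above can be sketched as a minimal Gradle build (task names are from this PR; the task bodies are illustrative placeholders, not Beam's actual implementation):

```groovy
// Minimal sketch: because postCommitIT lists 'sdist' in dependsOn, Gradle
// always runs sdist first, so the tarball at the default sdk_location
// already exists by the time the integration tests start.
task sdist {
  doLast {
    // Would build the source distribution tarball, e.g. build/apache-beam.tar.gz.
    println 'building apache-beam.tar.gz'
  }
}

task installGcpTest {
  doLast {
    println 'installing test dependencies'
  }
}

task postCommitIT(dependsOn: ['installGcpTest', 'sdist']) {
  doLast {
    // sdist has already run at this point, so the default tarball path resolves.
    println 'running integration tests'
  }
}
```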

@udim
Member

udim commented Aug 6, 2019

run python 2 postcommit

@markflyhigh
Contributor Author

python2 postcommit passed https://builds.apache.org/job/beam_PostCommit_Python2_PR/55/.

PTAL @udim

Member

@udim udim left a comment


LGTM, thanks

@markflyhigh markflyhigh merged commit ff0f308 into apache:master Aug 6, 2019