
Azure pipeline #262

Merged
merged 44 commits into from
Jul 4, 2019

Conversation

gusmith
Contributor

@gusmith gusmith commented Jun 27, 2019

This should be able to replace AppVeyor.

Python 3.4 is currently not tested by the Azure pipeline because it requires another C compiler which is quite hard to install from the command line (especially without access to a Windows machine to test locally). However, the clkhash documentation states that only Python versions 3.5 and later (and 2.7) are supported.

If you would like some docs, where should I add them?

Can replace appveyor.
@gusmith gusmith requested a review from hardbyte June 27, 2019 01:58
@gusmith gusmith self-assigned this Jun 27, 2019
@gusmith
Contributor Author

gusmith commented Jun 27, 2019

Codecov is behaving oddly: it seems to receive the report every time, but then does nothing with it and doesn't display it. It's quite random.
The report may actually be pushed by Travis.
I'm not sure it's a good idea to push the same report multiple times from different CI systems.

@hardbyte
Collaborator

Yeah, I'm not sure what is up with codecov - I thought it was supposed to handle reports from different CI systems, to cover testing across operating systems and Python versions. In the meantime the codecov status check is no longer required.

Collaborator

@hardbyte hardbyte left a comment


This is great Guillaume!

Regarding docs I think we should add a new development or devops page.

@@ -0,0 +1,104 @@
jobs:
- job: 'Test'
Collaborator


At the moment this job appears to test and package and publish... and just on Windows.
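A common way to separate those concerns is to split the pipeline into distinct jobs, sketched below with illustrative names (this is not the configuration that was merged):

```yaml
jobs:
- job: Test
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  - script: python -m pytest
    displayName: 'Run the tests'
- job: Package
  dependsOn: Test        # runs only after the Test job succeeds
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  - script: python setup.py sdist bdist_wheel
    displayName: 'Package'
```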

  displayName: 'Download Microsoft Visual C++ 9.0 from https://download.microsoft.com/download/7/9/6/796EF2E4-801B-4FC4-AB28-B59FBF6D907B/VCForPython27.msi'
  condition: eq(variables['python.version'], 2.7)
- powershell: Start-Process VCForPython27.msi -ArgumentList "/q" -Wait
  displayName: 'Install Microsoft Visual C++ 9.0.'
Collaborator


Is this required? I'm guessing for installing bitarray? #153

If so I think we should add a note in the docs.

Contributor Author


It may indeed be related to bitarray, but I cannot confirm it is the only requirement that needs it:

Building wheels for collected packages: bashplotlib, bitarray, future, functools32, pefile, pycparser, tornado, simplegeneric, win-unicode-console
  Building wheel for bashplotlib (setup.py): started
  Building wheel for bashplotlib (setup.py): finished with status 'done'
  Stored in directory: C:\Users\VssAdministrator\AppData\Local\pip\Cache\wheels\39\f0\8b\6e42f59f24ca2115413dc1a8195a2fb82f9daeed52839aa631
  Building wheel for bitarray (setup.py): started
  Building wheel for bitarray (setup.py): finished with status 'error'
  Running setup.py clean for bitarray
  ERROR: Complete output from command 'C:\hostedtoolcache\windows\Python\2.7.14\x64\python.exe' -u -c 'import setuptools, tokenize;__file__='"'"'c:\\users\\vssadm~1\\appdata\\local\\temp\\pip-install-omoklv\\bitarray\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'c:\users\vssadm~1\appdata\local\temp\pip-wheel-wal369' --python-tag cp27:
  ERROR: running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-2.7
  creating build\lib.win-amd64-2.7\bitarray
  copying bitarray\test_bitarray.py -> build\lib.win-amd64-2.7\bitarray
  copying bitarray\__init__.py -> build\lib.win-amd64-2.7\bitarray
  running build_ext
  building 'bitarray._bitarray' extension
  error: Microsoft Visual C++ 9.0 is required. Get it from http://aka.ms/vcpython27

# Add additional tasks to run using each Python version in the matrix above
- script: |
    python --version
    python -c "import struct; print(struct.calcsize('P') * 8)"
Collaborator


Can remove this whole task - the struct check isn't required anymore, and the UsePythonVersion@0 task prints the Python version.
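For reference, the struct check being removed here reports the bitness of the running interpreter, which is how the pipeline distinguished 32-bit from 64-bit Python builds:

```python
import struct

# 'P' is the struct format code for a C void* pointer; its size in bytes
# times 8 gives the bitness of the running interpreter: 32 or 64.
bits = struct.calcsize('P') * 8
print(bits)
```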

Contributor Author


👍

jobs:
- job: 'Test'
  pool:
    vmImage: 'vs2017-win2016'
Collaborator


Suggested change
vmImage: 'vs2017-win2016'
vmImage: $(imageName)

Consider adding support for OSX as well as Windows. https://docs.microsoft.com/en-us/azure/devops/pipelines/get-started-multiplatform?view=azure-devops#add-additional-platforms
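The linked guide drives the VM image from a strategy matrix, roughly like this (a sketch; the image names are the hosted-pool names current at the time):

```yaml
strategy:
  matrix:
    Windows:
      imageName: 'vs2017-win2016'
    macOS:
      imageName: 'macOS-10.13'
    Linux:
      imageName: 'ubuntu-16.04'
pool:
  vmImage: $(imageName)
```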

Contributor Author


Not originally part of the spec, but easy change :)

env:
INCLUDE_CLI: 1
- script: |
python -m pip install --upgrade pip
Collaborator


Just for consistency with the following commands, use -U instead of --upgrade.

Contributor Author


👍

- script: python -m codecov --token $(CODECOV_TOKEN) --file coverageReport.xml
  displayName: 'Send coverage to codecov'
  condition: and(eq(variables['python.version'], '3.7'), eq(variables['architecture'], 'x86'))
- script: |
Collaborator


I get the feeling that this should be a new deployment stage?

Contributor Author


Good point. I thought about it but creating a new job or stage would imply re-installing all the dependencies (as each job uses a different VM). So I chose speed over prettiness of stages/jobs/tasks. But happy to change if preferred.
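If a separate publish job were preferred, the cost of a fresh VM could be limited by passing the built distributions between jobs as a pipeline artifact; a sketch (job names illustrative, not the merged configuration):

```yaml
- job: Publish
  dependsOn: Test
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  - task: DownloadPipelineArtifact@2      # fetch the wheels built by the Test job
    inputs:
      path: $(Pipeline.Workspace)/dist
  - script: |
      pip install twine
      twine upload --skip-existing $(Pipeline.Workspace)/dist/*.whl
    displayName: 'Send artifact to Pypi'
```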

condition: and(eq(variables['python.version'], '3.7'), eq(variables['architecture'], 'x86'))
- script: |
    pip install twine
    twine upload -u $(PYPI_LOGIN) -p $(PYPI_PASSWORD) --skip-existing dist/*.whl
Collaborator


Contributor Author


I'm actually not sure when the auth task is useful, as we still need to run the twine commands afterwards...

testResultsFiles: 'testResults.xml'
testRunTitle: 'Test results python$(python.version), $(architecture)'
failTaskOnFailedTests: true
- script: python -m codecov --token $(CODECOV_TOKEN) --file coverageReport.xml
Collaborator


Contributor Author


👍

env:
INCLUDE_CLI: 1
- script: python -m pytest --cov=clkhash --junitxml=testResults.xml --cov-report=xml:coverageReport.xml
  displayName: 'Run the tests'
Collaborator


Suggested change
displayName: 'Run the tests'
displayName: 'Test with pytest for Python $(python.version) ($(architecture))'

Contributor Author


I'm not sure this is useful, as this step already belongs to a job that states the operating system and so on.
But note that the test results do include them, which makes the aggregated test report easy to read.

inputs:
testResultsFormat: 'JUnit'
testResultsFiles: 'testResults.xml'
testRunTitle: 'Test results python$(python.version), $(architecture)'
Collaborator


Suggested change
testRunTitle: 'Test results python$(python.version), $(architecture)'
testRunTitle: 'Publish test results for Python $(python.version) ($(architecture))'

Contributor Author


I will also include the operating system, but I will not add "Publish" at the beginning: this name is used for the test result report, not for the task of publishing. The name is then shown in the test tab of the build: https://dev.azure.com/data61/Anonlink/_build/results?buildId=40&view=ms.vss-test-web.build-test-results-tab To see it, you need to include the passed tests, as by default only the failed tests are shown.

python -m pip install -U wheel setuptools codecov
python -m pip install -U -r requirements.txt
python -m pip install .
displayName: 'Pip installs'
Collaborator


Suggested change
displayName: 'Pip installs'
displayName: 'Install dependencies'

Contributor Author


👍

@codecov

codecov bot commented Jun 27, 2019

Codecov Report

Merging #262 into master will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master     #262   +/-   ##
=======================================
  Coverage   93.98%   93.98%           
=======================================
  Files          17       17           
  Lines        1198     1198           
=======================================
  Hits         1126     1126           
  Misses         72       72

@gusmith
Contributor Author

gusmith commented Jun 27, 2019

OK, I've arrived at a stage where it is working again, and where it is super easy to add or remove Python versions, operating systems, and architectures.
However, it required creating a template (normal variables cannot be lists, whereas template parameters can).
And because Azure Pipelines is fun, the docs state that "for service-side operations such as setting display names, variables are not expanded recursively", which made constructing the display name a pleasure. So currently, because the job has a fixed name, the server appends the name of the matrix element being tested; I don't think such a name can include spaces (though I haven't properly tested this). It now renders as Test and publish for vs2017-win2016_Python_3.7_x64.
The coverage is published for each build, but the rendering does not show it... https://dev.azure.com/data61/Anonlink/_build/results?buildId=65&view=codecoverage-tab

But since we added sdist, a new artifact is created (a tar.gz). Should it be sent to PyPI using twine too? I had to filter to only .whl files as there were errors when trying to send a .exe file...
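For reference, the list-valued parameters mentioned above are only available to templates; a minimal sketch of the pattern (file and parameter names hypothetical, not the merged layout):

```yaml
# caller: azure-pipelines.yml
jobs:
- template: build-matrix.yml      # hypothetical template file
  parameters:
    pythonVersions: ['2.7', '3.5', '3.6', '3.7']
    architectures: ['x86', 'x64']

# template: build-matrix.yml
# parameters:
#   pythonVersions: []
#   architectures: []
# ...the template then expands the lists with ${{ each }} loops.
```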

@gusmith
Contributor Author

gusmith commented Jun 27, 2019

I'm also not really happy with the warning when publishing the coverage, but would you prefer the HTML created by pytest or the one created by the Azure step? The one currently shown is from Azure.

@gusmith
Contributor Author

gusmith commented Jun 27, 2019

PS: @hardbyte, sorry, it may be hard to review again as I moved all the code into a new file...
Once you're happy with these changes, I'll start working on documenting them. So this is not the final commit for this PR :)

@gusmith gusmith requested a review from hardbyte June 27, 2019 06:45
pip install twine
twine upload -u $(PYPI_LOGIN) -p $(PYPI_PASSWORD) --skip-existing dist/*.whl
displayName: 'Send artifact to Pypi'
condition: and(eq(variables['pythonVersion'], '3.7'), eq(variables['architecture'], 'x86'))
Collaborator

@hardbyte hardbyte Jun 27, 2019


Should we only publish to PyPI on a tagged commit, or for now perhaps only when it is manually chosen?

Now that you're publishing the pipeline artifact, can this easily be moved into its own job?

Contributor Author


The issue with tagged commits is that tagging usually happens after the commit has been pushed. So to still do that and have a manual confirmation, I had to create a release pipeline which is triggered after a tagged version is built from master.
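For comparison, a condition restricting a step to tag builds can be written against Build.SourceBranch (a sketch, not necessarily what was merged):

```yaml
- script: twine upload --skip-existing dist/*
  displayName: 'Send artifact to Pypi'
  # Build.SourceBranch is refs/tags/<tag> only when a tag triggered the build
  condition: startsWith(variables['Build.SourceBranch'], 'refs/tags/')
```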


- script: python setup.py sdist bdist_wheel
  displayName: 'Package'
  condition: and(eq(variables['pythonVersion'], '3.7'), eq(variables['architecture'], 'x86'))
Collaborator


I think we should package for every combo, and move this up (after installing requirements). We can remove line 43 (python -m pip install .) and instead test the packaged version.
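Testing the built wheel instead of the checked-out source could look roughly like this (a sketch, assuming the dependencies were already installed from requirements.txt; the merged pipeline may differ):

```yaml
- script: python setup.py sdist bdist_wheel
  displayName: 'Package'
- script: |
    # install the freshly built wheel rather than `python -m pip install .`
    python -m pip install --no-index --find-links=dist clkhash
    python -m pytest
  displayName: 'Test the packaged version'
```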

Contributor Author


Changed, but could you check that it is doing what you would like? I didn't point pytest at the dist, and it's still working.

@gusmith
Contributor Author

gusmith commented Jun 28, 2019

The only part not tested now is the release pipeline, which should be automatically triggered when we release a new version (i.e. when creating a new tag starting with a v on master).

@gusmith gusmith requested a review from hardbyte June 28, 2019 07:39
@hardbyte
Collaborator

But since we added sdist, a new artifact is created (a tar.gz). Should it be sent to PyPI using twine too? I had to filter to only .whl files as there were errors when trying to send a .exe file...

Yes, twine supports both tar.gz and whl files.

I'm also not really happy with the warning when publishing the coverage, but would you prefer the HTML created by pytest or the one created by the Azure step? The one currently shown is from Azure.

I don't have any preference between the HTML reports. Are you talking about this error? It looks to be from calling codecov too frequently.

Collaborator

@hardbyte hardbyte left a comment


I'll approve now, but we need to test the release pipeline before we're all done. Just set the version in setup.py and edit this pre-release to tag your branch: https://github.com/data61/clkhash/releases/tag/v0.13.1-b2

By the way, you can release on "any" tag - it doesn't have to start with v (that's what I've been doing in AppVeyor).


- task: PublishTestResults@2
  displayName: 'Publish test results in Azure'
  condition: succeeded()
Collaborator


Does this mean it won't publish test results if they don't all pass?
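For reference, publishing results even when the tests fail requires a different condition, for example:

```yaml
- task: PublishTestResults@2
  displayName: 'Publish test results in Azure'
  # succeededOrFailed() still runs this step after a failed test step,
  # whereas succeeded() skips it once any earlier step has failed
  condition: succeededOrFailed()
```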

Contributor Author


Good catch :)

@gusmith
Contributor Author

gusmith commented Jul 1, 2019

But since we added sdist, a new artifact is created (a tar.gz). Should it be sent to PyPI using twine too? I had to filter to only .whl files as there were errors when trying to send a .exe file...

Yes, twine supports both tar.gz and whl files.

That is what I thought. So twine will try to publish all the .whl and tar.gz files produced by a build which triggers the release pipeline (the twine step is now only in the release pipeline).

I'm also not really happy with the warning when publishing the coverage, but would you prefer the HTML created by pytest or the one created by the Azure step? The one currently shown is from Azure.

I don't have any preference between the HTML reports. Are you talking about this error? It looks to be from calling codecov too frequently.

I was not pointing to that warning, which I hadn't seen (because it is not surfaced by the pipeline). The warning I meant has not appeared for quite a few builds because I commented out the line creating the reports, leaving a comment in the code where it was.
The warning you are pointing out could indeed be from logging in too often. I'll fix that by creating all the reports as separate artifacts and publishing them all at once at the end.

@gusmith
Contributor Author

gusmith commented Jul 1, 2019

OK, so at this stage the automatic triggering of the release pipeline works.
Just to confirm, the build pipeline is triggered:

  • for every commit on master
  • for every commit that is part of a PR into any branch
  • for every tag

The release pipeline is triggered automatically after the build pipeline succeeds:

  • for every tag.

The release pipeline can also be triggered manually, in which case the requester is asked to choose a specific artifact to release (an artifact being an output of the build pipeline).

Also note that in the release pipeline there is a manual check: @gusmith and @hardbyte receive an email stating that they need to resume the build. They can also cancel it if required.
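The build-pipeline triggers described above correspond roughly to this configuration (a sketch, not necessarily the merged file):

```yaml
trigger:
  branches:
    include:
    - master
  tags:
    include:
    - '*'
pr:
  branches:
    include:
    - '*'
```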

Guillaume Smith added 5 commits July 1, 2019 15:09
I do not know if the one running with Python 2.7 includes it, but we are not publishing it, so it is not really important.
@gusmith
Contributor Author

gusmith commented Jul 1, 2019

Finally!!! This is working end to end, with the README file displayed properly on PyPI: https://pypi.org/project/clkhash/0.13.1b6/

@gusmith
Contributor Author

gusmith commented Jul 2, 2019

Just to be sure it suits the requirements, one last review :)

@gusmith gusmith requested a review from hardbyte July 2, 2019 00:48
…-pipeline

Conflicts:
	setup.py -> just the version
@gusmith gusmith merged commit ea99c5c into master Jul 4, 2019
@gusmith gusmith deleted the feature-azure-pipeline branch July 4, 2019 01:21
@hardbyte hardbyte added this to the clkhash 0.14.0 milestone Jul 5, 2019