Adapt ipython tools for testing pull requests #752

Merged
merged 11 commits into from

5 participants

@jtorrents
Owner

@ipython has a very nice set of tools for their github based development workflow. See:

https://github.com/ipython/ipython/tree/master/tools

As @fperez suggested at SciPy 2012, it is easy and useful to adapt some of them for NetworkX development. This pull request adapts the test_pr.py script and friends for NetworkX. It automatically tests pull requests (using virtualenv and downloading the code directly from github) and optionally posts the results as a comment in the pull request page (including a link to a gist with the log for failed tests). It is a very nice and useful tool. For the original ipython test_pr.py see:

https://github.com/ipython/ipython/blob/master/tools/test_pr.py

I made some changes to how the tests are run inside each virtualenv. IPython uses a script for running tests which is installed in the 'bin' directory of each virtualenv. But I think it is easier for NetworkX tests to simply use check_output:

py = os.path.join(basedir, venv, 'bin', 'python')
cmd = [py, '-c', 'import networkx as nx; nx.test(verbosity=2,doctest=True)']
log = check_output(cmd, stderr=STDOUT).decode('utf-8')
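
For illustration, here is a self-contained sketch of that check_output pattern, with sys.executable standing in for the venv's python (the venv path above is site-specific) and a CalledProcessError handler so a failing run yields its log instead of crashing the script:

```python
import sys
from subprocess import check_output, CalledProcessError, STDOUT

def run_cmd(cmd):
    # Capture stdout and stderr together; report (passed, log) rather
    # than letting a non-zero exit raise out of the caller.
    try:
        return True, check_output(cmd, stderr=STDOUT).decode('utf-8')
    except CalledProcessError as e:
        return False, e.output.decode('utf-8')

passed, log = run_cmd([sys.executable, '-c', 'print("OK")'])
failed, err = run_cmd([sys.executable, '-c',
                       'import sys; sys.stderr.write("boom\\n"); sys.exit(1)'])
```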

I also added PyPy support (last commit). But I'm not sure if it will work on anything other than a Debian-based system. PyPy and virtualenv play nicely together and no significant change in the code is required, but on Debian/Ubuntu systems the --system-site-packages option of virtualenv does not add the actual dist-packages path for pypy (/usr/local/lib/pypy2.7/dist-packages). As a quick and dirty solution, I've opted for adding the path explicitly in run_tests() to be able to import nose, but it is not a solid solution, so more work is needed. I'll be quite busy over the next few days and I'm not sure if I will be able to work on this, so I'm posting it hoping that someone can take a look, test it on other systems and also work on it.

This script assumes that there is a directory named .nx_pr_tests in your home directory and that the nose package and other libraries are installed system-wide in each python environment. It does not import anything from the user-defined $PYTHONPATH.
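
A minimal sketch of that directory assumption, using the same EEXIST-swallowing idiom the script's setup() uses for the work directory (the helper name is illustrative, not part of the script):

```python
import errno
import os

def ensure_dir(path):
    """Create `path` if it doesn't exist; tolerate it already existing."""
    try:
        os.mkdir(path)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise
```

Calling it twice with the same path is safe, which is what lets the script be rerun against a populated ~/.nx_pr_tests.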

@jtorrents
Owner

Test results for commit 773c75b merged into master (3fc2ba9)
Platform: linux2

  • python2.6: OK (SKIP=57) Ran 1525 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: OK (SKIP=2) Ran 1707 tests (libraries not available: ogr)
  • python3.2: OK (SKIP=10) Ran 1678 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
  • pypy: OK (SKIP=57) Ran 1525 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
@hagberg hagberg was assigned
@fperez

Very happy to see you guys using our tools! Let us know if anything needs clarification, and conversely, please send any updates/improvements our way :)

@Carreau

Note that we use gh_api.py, which could maybe be merged with pygithub, to navigate github programmatically from Python.
There is also hub that adds a lot of github magics to git, but in Ruby.

@jtorrents
Owner

@fperez One of the things that I like about test_pr and friends is their simplicity: by reading the code you get a clear idea of what is going on. When our adaptation is merged and has settled down I will put together a pull request with all the changes that could also benefit ipython's test_pr.

@Carreau we used pygithub for the migration trac -> github. It is under active development and has a lot of functionality, but is still not well documented and is rather big and complex compared with gh_api.py. For now I think that it is better to keep gh_api.py instead of adding an external dependency.

In the future it might make sense to create a project to abstract and generalize this kind of script into a proper standalone program (if I recall correctly, @Carreau suggested that at scipy2012). Other projects, such as @sympy, have sympy-bot for testing pull requests, which creates very nice reviews. I still like test_pr.py better because it is small, simple and gets the job done.

Thanks for test_pr.py and friends @ipython guys!

@hagberg
Owner

The test_pr.py script isn't working for me - maybe I'm missing something?

[aric@ll tools (tools)]$ python test_pr.py 742
Traceback (most recent call last):
  File "test_pr.py", line 340, in <module>
    test_pr(args.number, post_results=args.publish)
  File "test_pr.py", line 312, in test_pr
    testrun = TestRun(num)
  File "test_pr.py", line 77, in __init__
    self.pr = gh_api.get_pull_request(gh_project, pr_num)
  File "/home/aric/Software/hagberg-networkx/tools/gh_api.py", line 95, in get_pull_request
    return json.loads(response.text, object_hook=Obj)
AttributeError: 'Response' object has no attribute 'text'
@takluyver

@hagberg: I think you need a newer version of requests.

@hagberg
Owner

OK - I figured it out. My Ubuntu installation had an old/different version of the "requests" package.
I installed the latest one from here and it works: http://docs.python-requests.org/en/latest/index.html

But I don't have a PYTHONPATH set and it breaks looking for that:

Traceback (most recent call last):
  File "test_pr.py", line 340, in <module>
    test_pr(args.number, post_results=args.publish)
  File "test_pr.py", line 316, in test_pr
    testrun.run()
  File "test_pr.py", line 231, in run
    passed, log = run_tests(venv)
  File "test_pr.py", line 267, in run_tests
    orig_pythonpath = os.environ["PYTHONPATH"]
  File "/usr/lib/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'PYTHONPATH'

These are small things but we should update the script to work (or break gracefully) in these cases. I have a stock Ubuntu 12.04 install.
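
The KeyError above comes from indexing os.environ directly on a machine where PYTHONPATH is unset. A sketch of the graceful alternatives (not the exact fix that later landed):

```python
import os

# None instead of KeyError when the variable is unset
orig = os.environ.get("PYTHONPATH")

# Remove it if present; a no-op otherwise
os.environ.pop("PYTHONPATH", None)
```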

@jtorrents
Owner

Yes, I forgot to mention that. The python-requests package in Ubuntu 12.04 is outdated; you need to install a newer version. I'm not sure which one, I installed it from its github repo. I'll try to figure out the minimum requests version needed for test_pr.py.

@takluyver
@jtorrents
Owner

I introduced the PYTHONPATH bug when trying to support pypy on Ubuntu. The commit above fixes it. But it is still not clear to me who is to blame for not adding the correct dist-packages path when creating a pypy virtualenv. I'll look at that, but I will not have a lot of time over the next few days. Maybe an option is to remove pypy support for now and add it back once we have figured this out.

I also added a check for the requests version; if our requirements are not met, the script exits with an informative message.
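
A sketch of such a version guard (the helper name is hypothetical): comparing the parsed components as a tuple also behaves sensibly once requests reaches 1.x, which a bare comparison of the minor field alone would not.

```python
def requests_version_ok(version_string, minimum=(0, 10, 0)):
    # Parse up to three numeric components and compare lexicographically.
    parts = tuple(int(p) for p in version_string.split('.')[:3])
    return parts >= minimum
```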

@hagberg
Owner
@jtorrents
Owner

Test results for commit 8088d2e merged into master (3fc2ba9)
Platform: linux2

  • python2.6: OK (SKIP=57) Ran 1525 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: OK (SKIP=2) Ran 1707 tests (libraries not available: ogr)
  • python3.2: OK (SKIP=10) Ran 1678 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
@fperez
@Carreau

Note that this might make some of this work easier,

https://gist.github.com/3342247

and moreover, instead of fetching refs/pull/*/head you can fetch refs/pull/*/merge, which is already the pull request merged into the current master.

That's actually how github tests whether "This pull request can be automatically merged": by trying the merge and seeing if it fails. Though I don't know what /merge points to when the merge doesn't work, the github api should tell you that it's not mergeable.

@jtorrents
Owner

@Carreau, that is very nice. I didn't know you could do that; it is very useful for easily fetching pull requests. I still don't have a good understanding of git, so I don't have an informed opinion of the changes that you propose. But I will port any changes that you guys make to ipython's test_pr.py. However, I think that the summary of the script should be more explicit if the pull request is not mergeable, maybe asking explicitly to rebase from master (as sympy-bot does). If I'm not mistaken, this can be done with the current implementation; it is only a matter of changing the summary output. I'll try to do that.

@Carreau

I still don't have a good understanding of git, so I don't have an informed opinion of the changes that you propose.

I don't know git well either; I discovered it by accident, by mistyping what was in the gist.

I think that you could:

  1. Ask the github api: is it mergeable?
  2. If so, clone /pull/#pr/merge, and do it with a --depth of 1, as you'll only need the most recent commit.
  3. Then run the tests.

It should decrease the required bandwidth a lot.
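
The steps above can be sketched as the git invocations they imply. This hypothetical helper only builds the command lists; nothing here talks to GitHub, and the mergeability check via the API would happen before running them:

```python
def pr_merge_fetch_cmds(pr_num, remote='origin'):
    # GitHub exposes the pre-merged state of a PR under refs/pull/N/merge.
    ref = 'refs/pull/%d/merge' % pr_num
    return [
        # --depth 1: fetch only the merge commit, saving bandwidth
        ['git', 'fetch', '--depth', '1', remote, ref],
        ['git', 'checkout', 'FETCH_HEAD'],
    ]
```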

@jtorrents
Owner

NetworkX: Test results for pull request #752
Branch tools from @jtorrents at https://github.com/jtorrents/networkx.git
:eight_spoked_asterisk: This pull request can be merged clearly (commit 804e903 into NetworkX master 50cb114)
Platform: linux2

  • python2.6: :eight_spoked_asterisk: OK (SKIP=57) Ran 1526 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: :eight_spoked_asterisk: OK (SKIP=2) Ran 1708 tests (libraries not available: ogr)
  • python3.2: :eight_spoked_asterisk: OK (SKIP=10) Ran 1679 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
@jtorrents
Owner

Finally I've found some time to work on this. I've improved the test reports to make them clearer, since they caused some confusion (see #762). As Chris (@chebee7i) noted, posting test results that pass as github comments (as I just did above) can give a false sense of security. We could adopt the convention of only posting the result of the script if there is some failure (test_pr.py will post a trace of the nose output in a gist). That way we avoid the false sense of security, and a posted result signals to the author of the pull request that there is a problem that needs fixing.

@takluyver

I think it's quite important to post on success, so that everyone can see that the PR was tested before it was merged. Scanning through #762, it looks like the confusion just relates to the wording 'merged into master', so maybe that can be adjusted. Then again, I don't think we've had the same confusion in IPython, so maybe it was a one-off.

@jtorrents
Owner
@takluyver

In fact, I think your wording is already clearer in the report above: "This pull request can be merged clearly..." (although I think it should say 'cleanly', not clearly). I like the little icons - I assume there's a different icon for a failure? I'm not so sure about the branch description line; it seems to duplicate information already in the PR, although I know there's not a link to the repository. Maybe it should be integrated into the title: "...pull request #752 (jtorrents 'tools' branch)"

@jtorrents
Owner
@jtorrents
Owner

NetworkX: Test results for pull request #752 (jtorrents 'tools' branch)
:red_circle: This pull request cannot be merged cleanly (commit 430d53f into NetworkX master 50cb114). Please rebase from upstream/master
Platform: linux2

  • python2.6: :eight_spoked_asterisk: OK (SKIP=57) Ran 1525 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: :eight_spoked_asterisk: OK (SKIP=2) Ran 1707 tests (libraries not available: ogr)
  • python3.2: :eight_spoked_asterisk: OK (SKIP=10) Ran 1678 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
@jtorrents
Owner

NetworkX: Test results for pull request #752 (jtorrents 'tools' branch)
:eight_spoked_asterisk: This pull request can be merged cleanly (commit 430d53f into NetworkX master 50cb114)
Platform: linux2

  • python2.6: :eight_spoked_asterisk: OK (SKIP=57) Ran 1526 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: :eight_spoked_asterisk: OK (SKIP=2) Ran 1708 tests (libraries not available: ogr)
  • python3.2: :eight_spoked_asterisk: OK (SKIP=10) Ran 1679 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
@jtorrents
Owner

Implemented the changes suggested by Thomas. I also posted a message for a non-mergeable pull request; it asks explicitly to rebase from upstream/master. Let me know if anything should be changed/improved.

@takluyver

Doesn't git default to calling the clone source 'origin', rather than 'upstream'? I guess that doesn't work if you cloned from your own Github fork, though. Can the rebase command use a remote repository URL?

@jtorrents
Owner

NetworkX: Test results for pull request #752 (jtorrents 'tools' branch)
:eight_spoked_asterisk: This pull request can be merged cleanly (commit 854868c into NetworkX master 50cb114)
Platform: linux2

  • python2.6: :eight_spoked_asterisk: OK (SKIP=57) Ran 1526 tests (libraries not available: pyyaml, pydot, numpy, matplotlib, ogr, yaml, scipy, pyparsing, pygraphviz)
  • python2.7: :eight_spoked_asterisk: OK (SKIP=2) Ran 1708 tests (libraries not available: ogr)
  • python3.2: :eight_spoked_asterisk: OK (SKIP=10) Ran 1679 tests (libraries not available: pyparsing, ogr, pygraphviz, matplotlib, pydot)
@jtorrents
Owner
@fperez
@jtorrents
Owner
@fperez
@hagberg hagberg merged commit ef11bff into networkx:master
Showing with 569 additions and 0 deletions.
  1. +201 −0 tools/gh_api.py
  2. +13 −0 tools/post_pr_test.py
  3. +355 −0 tools/test_pr.py
201 tools/gh_api.py
@@ -0,0 +1,201 @@
+"""Functions for Github authorisation."""
+from __future__ import print_function
+
+try:
+ input = raw_input
+except NameError:
+ pass
+
+import os
+
+import requests
+import getpass
+import json
+
+# Keyring stores passwords by a 'username', but we're not storing a username and
+# password
+fake_username = 'networkx-tests'
+
+class Obj(dict):
+ """Dictionary with attribute access to names."""
+ def __getattr__(self, name):
+ try:
+ return self[name]
+ except KeyError:
+ raise AttributeError(name)
+
+ def __setattr__(self, name, val):
+ self[name] = val
+
+token = None
+def get_auth_token():
+ global token
+
+ if token is not None:
+ return token
+
+ import keyring
+ token = keyring.get_password('github', fake_username)
+ if token is not None:
+ return token
+
+ print("Please enter your github username and password. These are not "
+ "stored, only used to get an oAuth token. You can revoke this at "
+ "any time on Github.")
+ user = input("Username: ")
+ pw = getpass.getpass("Password: ")
+
+ auth_request = {
+ "scopes": [
+ "public_repo",
+ "gist"
+ ],
+ "note": "NetworkX tools",
+ "note_url": "https://github.com/networkx/networkx/tree/master/tools",
+ }
+ response = requests.post('https://api.github.com/authorizations',
+ auth=(user, pw), data=json.dumps(auth_request))
+ response.raise_for_status()
+ token = json.loads(response.text)['token']
+ keyring.set_password('github', fake_username, token)
+ return token
+
+def make_auth_header():
+ return {'Authorization': 'token ' + get_auth_token()}
+
+def post_issue_comment(project, num, body):
+ url = 'https://api.github.com/repos/{project}/issues/{num}/comments'.format(project=project, num=num)
+ payload = json.dumps({'body': body})
+ r = requests.post(url, data=payload, headers=make_auth_header())
+
+def post_gist(content, description='', filename='file', auth=False):
+ """Post some text to a Gist, and return the URL."""
+ post_data = json.dumps({
+ "description": description,
+ "public": True,
+ "files": {
+ filename: {
+ "content": content
+ }
+ }
+ }).encode('utf-8')
+
+ headers = make_auth_header() if auth else {}
+ response = requests.post("https://api.github.com/gists", data=post_data, headers=headers)
+ response.raise_for_status()
+ response_data = json.loads(response.text)
+ return response_data['html_url']
+
+def get_pull_request(project, num):
+ """get pull request info by number
+ """
+ url = "https://api.github.com/repos/{project}/pulls/{num}".format(project=project, num=num)
+ response = requests.get(url)
+ response.raise_for_status()
+ return json.loads(response.text, object_hook=Obj)
+
+def get_pulls_list(project):
+ """get pull request list
+ """
+ url = "https://api.github.com/repos/{project}/pulls".format(project=project)
+ response = requests.get(url)
+ response.raise_for_status()
+ return json.loads(response.text)
+
+# encode_multipart_formdata is from urllib3.filepost
+# The only change is to iter_fields, to enforce S3's required key ordering
+
+def iter_fields(fields):
+ fields = fields.copy()
+ for key in ('key', 'acl', 'Filename', 'success_action_status', 'AWSAccessKeyId',
+ 'Policy', 'Signature', 'Content-Type', 'file'):
+ yield (key, fields.pop(key))
+ for (k,v) in fields.items():
+ yield k,v
+
+def encode_multipart_formdata(fields, boundary=None):
+ """
+ Encode a dictionary of ``fields`` using the multipart/form-data mime format.
+
+ :param fields:
+ Dictionary of fields or list of (key, value) field tuples. The key is
+ treated as the field name, and the value as the body of the form-data
+ bytes. If the value is a tuple of two elements, then the first element
+ is treated as the filename of the form-data section.
+
+ Field names and filenames must be unicode.
+
+ :param boundary:
+ If not specified, then a random boundary will be generated using
+ :func:`mimetools.choose_boundary`.
+ """
+ # copy requests imports in here:
+ from io import BytesIO
+ from requests.packages.urllib3.filepost import (
+ choose_boundary, six, writer, b, get_content_type
+ )
+ body = BytesIO()
+ if boundary is None:
+ boundary = choose_boundary()
+
+ for fieldname, value in iter_fields(fields):
+ body.write(b('--%s\r\n' % (boundary)))
+
+ if isinstance(value, tuple):
+ filename, data = value
+ writer(body).write('Content-Disposition: form-data; name="%s"; '
+ 'filename="%s"\r\n' % (fieldname, filename))
+ body.write(b('Content-Type: %s\r\n\r\n' %
+ (get_content_type(filename))))
+ else:
+ data = value
+ writer(body).write('Content-Disposition: form-data; name="%s"\r\n'
+ % (fieldname))
+ body.write(b'Content-Type: text/plain\r\n\r\n')
+
+ if isinstance(data, int):
+ data = str(data) # Backwards compatibility
+ if isinstance(data, six.text_type):
+ writer(body).write(data)
+ else:
+ body.write(data)
+
+ body.write(b'\r\n')
+
+ body.write(b('--%s--\r\n' % (boundary)))
+
+ content_type = b('multipart/form-data; boundary=%s' % boundary)
+
+ return body.getvalue(), content_type
+
+
+def post_download(project, filename, name=None, description=""):
+ """Upload a file to the GitHub downloads area"""
+ if name is None:
+ name = os.path.basename(filename)
+ with open(filename, 'rb') as f:
+ filedata = f.read()
+
+ url = "https://api.github.com/repos/{project}/downloads".format(project=project)
+
+ payload = json.dumps(dict(name=name, size=len(filedata),
+ description=description))
+ response = requests.post(url, data=payload, headers=make_auth_header())
+ response.raise_for_status()
+ reply = json.loads(response.content)
+ s3_url = reply['s3_url']
+
+ fields = dict(
+ key=reply['path'],
+ acl=reply['acl'],
+ success_action_status=201,
+ Filename=reply['name'],
+ AWSAccessKeyId=reply['accesskeyid'],
+ Policy=reply['policy'],
+ Signature=reply['signature'],
+ file=(reply['name'], filedata),
+ )
+ fields['Content-Type'] = reply['mime_type']
+ data, content_type = encode_multipart_formdata(fields)
+ s3r = requests.post(s3_url, data=data, headers={'Content-Type': content_type})
+ return s3r
13 tools/post_pr_test.py
@@ -0,0 +1,13 @@
+#!/usr/bin/env python
+"""Post the results of a pull request test to Github.
+"""
+from test_pr import TestRun
+
+testrun = TestRun.load_results()
+testrun.post_logs()
+testrun.print_results()
+testrun.post_results_comment()
+
+print()
+print("Posted test results to pull request")
+print(" " + testrun.pr['html_url'])
355 tools/test_pr.py
@@ -0,0 +1,355 @@
+#!/usr/bin/env python
+"""
+This is a script for testing pull requests for NetworkX. It merges the pull
+request with current master, installs and tests on all available versions of
+Python, and posts the results to Gist if any tests fail.
+
+This script is heavily based on IPython's test_pr.py and friends. See:
+
+http://github.com/ipython/ipython/tree/master/tools
+
+Usage:
+ python test_pr.py 742
+"""
+from __future__ import print_function
+
+import errno
+from glob import glob
+import io
+import json
+import os
+import pickle
+import re
+import requests
+import shutil
+import time
+from subprocess import call, check_call, check_output, PIPE, STDOUT, CalledProcessError
+import sys
+
+import gh_api
+from gh_api import Obj
+
+basedir = os.path.join(os.path.expanduser("~"), ".nx_pr_tests")
+repodir = os.path.join(basedir, "networkx")
+nx_repository = 'git://github.com/networkx/networkx.git'
+nx_http_repository = 'http://github.com/networkx/networkx.git'
+gh_project="networkx/networkx"
+
+# TODO Add PyPy support
+supported_pythons = ['python2.6', 'python2.7', 'python3.2']
+
+# Report missing libraries during tests and number of skipped
+# and passed tests.
+missing_libs_re = re.compile('SKIP: (\w+) not available')
+def get_missing_libraries(log):
+ libs = set()
+ for line in log.split('\n'):
+ m = missing_libs_re.search(line)
+ if m:
+ libs.add(m.group(1).lower())
+ if libs:
+ return ", ".join(libs)
+
+skipped_re = re.compile('SKIP=(\d+)')
+def get_skipped(log):
+ m = skipped_re.search(log)
+ if m:
+ return m.group(1)
+
+number_tests_re = re.compile('Ran (\d+) tests in')
+def get_number_tests(log):
+ m = number_tests_re.search(log)
+ if m:
+ return m.group(1)
+
+
+class TestRun(object):
+ def __init__(self, pr_num):
+ self.unavailable_pythons = []
+ self.venvs = []
+ self.pr_num = pr_num
+
+ self.pr = gh_api.get_pull_request(gh_project, pr_num)
+
+ self.setup()
+
+ self.results = []
+
+ def available_python_versions(self):
+ """Get the executable names of available versions of Python on the system.
+ """
+ for py in supported_pythons:
+ try:
+ check_call([py, '-c', 'import nose'], stdout=PIPE)
+ yield py
+ except (OSError, CalledProcessError):
+ self.unavailable_pythons.append(py)
+
+ def setup(self):
+ """Prepare the repository and virtualenvs."""
+ try:
+ os.mkdir(basedir)
+ except OSError as e:
+ if e.errno != errno.EEXIST:
+ raise
+ os.chdir(basedir)
+
+ # Delete virtualenvs and recreate
+ for venv in glob('venv-*'):
+ shutil.rmtree(venv)
+ for py in self.available_python_versions():
+ check_call(['virtualenv', '-p', py,
+ '--system-site-packages', 'venv-%s' % py])
+
+ self.venvs.append((py, 'venv-%s' % py))
+
+ # Check out and update the repository
+ if not os.path.exists('networkx'):
+ try :
+ check_call(['git', 'clone', nx_repository])
+ except CalledProcessError:
+ check_call(['git', 'clone', nx_http_repository])
+ os.chdir(repodir)
+ check_call(['git', 'checkout', 'master'])
+ try :
+ check_call(['git', 'pull', 'origin', 'master'])
+ except CalledProcessError:
+ check_call(['git', 'pull', nx_http_repository, 'master'])
+ self.master_sha = check_output(['git', 'log', '-1',
+ '--format=%h']).decode('ascii').strip()
+ os.chdir(basedir)
+
+ def get_branch(self):
+ repo = self.pr['head']['repo']['clone_url']
+ branch = self.pr['head']['ref']
+ owner = self.pr['head']['repo']['owner']['login']
+ mergeable = self.pr['mergeable']
+
+ os.chdir(repodir)
+ if mergeable:
+ merged_branch = "%s-%s" % (owner, branch)
+ # Delete the branch first
+ call(['git', 'branch', '-D', merged_branch])
+ check_call(['git', 'checkout', '-b', merged_branch])
+ check_call(['git', 'pull', '--no-ff', '--no-commit', repo, branch])
+ check_call(['git', 'commit', '-m', "merge %s/%s" % (repo, branch)])
+ else:
+ # Fetch the branch without merging it.
+ check_call(['git', 'fetch', repo, branch])
+ check_call(['git', 'checkout', 'FETCH_HEAD'])
+ os.chdir(basedir)
+
+ def markdown_format(self):
+ def format_result(result):
+ s = "* %s: " % result.py
+ if result.passed:
+ s += "%s OK (SKIP=%s) Ran %s tests" % \
+ (ok, result.skipped, result.num_tests)
+ else:
+ s += "%s Failed, log at %s" % (fail, result.log_url)
+ if result.missing_libraries:
+ s += " (libraries not available: " + result.missing_libraries + ")"
+ return s
+ pr_num = self.pr_num
+ branch = self.pr['head']['ref']
+ branch_url = self.pr['head']['repo']['html_url'] + '/tree/' + branch
+ owner = self.pr['head']['repo']['owner']['login']
+ mergeable = self.pr['mergeable']
+ master_sha = self.master_sha
+ branch_sha = self.pr['head']['sha'][:7]
+ ok = ':eight_spoked_asterisk:'
+ fail = ':red_circle:'
+
+ header = "**NetworkX: Test results for pull request #%s " % pr_num
+ header += "([%s '%s' branch](%s))**" % (owner, branch, branch_url)
+ if mergeable:
+ mrg = "%s This pull request can be merged cleanly " % ok
+ else:
+ mrg = "%s This pull request **cannot** be merged cleanly " % fail
+ mrg += "(commit %s into NetworkX master %s)" % (branch_sha, master_sha)
+ lines = [header,
+ mrg,
+ "Platform: " + sys.platform,
+ ""] + \
+ [format_result(r) for r in self.results]
+ if self.unavailable_pythons:
+ lines += ["",
+ "Not available for testing: " \
+ + ", ".join(self.unavailable_pythons)]
+ return "\n".join(lines)
+
+ def post_results_comment(self):
+ body = self.markdown_format()
+ gh_api.post_issue_comment(gh_project, self.pr_num, body)
+
+ def print_results(self):
+ pr_num = self.pr_num
+ branch = self.pr['head']['ref']
+ branch_url = self.pr['head']['repo']['html_url'] + '/tree/' + branch
+ owner = self.pr['head']['repo']['owner']['login']
+ mergeable = self.pr['mergeable']
+ master_sha = self.master_sha
+ branch_sha = self.pr['head']['sha'][:7]
+
+ print("\n")
+ print("**NetworkX: Test results for pull request %s " % pr_num,
+ "(%s '%s' branch at %s)**" % (owner, branch, branch_url))
+ if mergeable:
+ mrg = "OK: This pull request can be merged cleanly "
+ else:
+ mrg = "FAIL: This pull request **cannot** be merged cleanly "
+ mrg += "(commit %s into NetworkX master %s)" % (branch_sha, master_sha)
+ print(mrg)
+ print("Platform:", sys.platform)
+ for result in self.results:
+ if result.passed:
+ print(result.py, ":", "OK (SKIP=%s) Ran %s tests" % \
+ (result.skipped, result.num_tests))
+ else:
+ print(result.py, ":", "Failed")
+ print(" Test log:", result.get('log_url') or result.log_file)
+ if result.missing_libraries:
+ print(" Libraries not available:", result.missing_libraries)
+ if self.unavailable_pythons:
+ print("Not available for testing:",
+ ", ".join(self.unavailable_pythons))
+
+ def dump_results(self):
+ with open(os.path.join(basedir, 'lastresults.pkl'), 'wb') as f:
+ pickle.dump(self, f)
+
+ @staticmethod
+ def load_results():
+ with open(os.path.join(basedir, 'lastresults.pkl'), 'rb') as f:
+ return pickle.load(f)
+
+ def save_logs(self):
+ for result in self.results:
+ if not result.passed:
+ result_locn = os.path.abspath(os.path.join('venv-%s' % result.py,
+ self.pr['head']['sha'][:7]+".log"))
+ with io.open(result_locn, 'w', encoding='utf-8') as f:
+ f.write(result.log)
+
+ result.log_file = result_locn
+
+ def post_logs(self):
+ for result in self.results:
+ if not result.passed:
+ result.log_url = gh_api.post_gist(result.log,
+ description='NetworkX test log',
+ filename="results.log", auth=True)
+
+ def run(self):
+ for py, venv in self.venvs:
+ tic = time.time()
+ passed, log = run_tests(venv)
+ elapsed = int(time.time() - tic)
+ print("Ran tests with %s in %is" % (py, elapsed))
+ missing_libraries = get_missing_libraries(log)
+ skipped = get_skipped(log)
+ num_tests = get_number_tests(log)
+
+ self.results.append(Obj(py=py,
+ passed=passed,
+ log=log,
+ missing_libraries=missing_libraries,
+ skipped=skipped,
+ num_tests=num_tests
+ )
+ )
+
+
+def run_tests(venv):
+ version = venv.split('-')[1]
+ py = os.path.join(basedir, venv, 'bin', 'python')
+ os.chdir(repodir)
+ # cleanup build-dir
+ if os.path.exists('build'):
+ shutil.rmtree('build')
+ #tic = time.time()
+ print ("\nInstalling NetworkX with %s" % py)
+ logfile = os.path.join(basedir, venv, 'install.log')
+ print ("Install log at %s" % logfile)
+ with open(logfile, 'wb') as f:
+ check_call([py, 'setup.py', 'install'], stderr=STDOUT, stdout=f)
+ #toc = time.time()
+ #print ("Installed NetworkX in %.1fs" % (toc-tic))
+ os.chdir(basedir)
+
+ # Remove PYTHONPATH if present
+ os.environ.pop("PYTHONPATH", None)
+
+ # check that the right NetworkX is imported. Also catch exception if
+ # the pull request breaks "import networkx as nx"
+ try:
+ cmd_file = [py, '-c', 'import networkx as nx; print(nx.__file__)']
+ nx_file = check_output(cmd_file, stderr=STDOUT)
+ except CalledProcessError as e:
+ return False, e.output.decode('utf-8')
+
+ nx_file = nx_file.strip().decode('utf-8')
+ if not nx_file.startswith(os.path.join(basedir, venv)):
+ msg = u"NetworkX does not appear to be in the venv: %s" % nx_file
+ msg += u"\nDo you use setupegg.py develop?"
+ print(msg, file=sys.stderr)
+ return False, msg
+
+ # Run tests: this is different than in ipython's test_pr, they use
+ # a script for running their tests. It gets installed at
+ # os.path.join(basedir, venv, 'bin', 'iptest')
+ print("\nRunning tests with %s ..." % version)
+ cmd = [py, '-c', 'import networkx as nx; nx.test(verbosity=2,doctest=True)']
+ try:
+ return True, check_output(cmd, stderr=STDOUT).decode('utf-8')
+ except CalledProcessError as e:
+ return False, e.output.decode('utf-8')
+
+
+def test_pr(num, post_results=True):
+ # Get Github authorisation first, so that the user is prompted straight away
+ # if their login is needed.
+ if post_results:
+ gh_api.get_auth_token()
+
+ testrun = TestRun(num)
+
+ testrun.get_branch()
+
+ testrun.run()
+
+ testrun.dump_results()
+
+ testrun.save_logs()
+ testrun.print_results()
+
+ if post_results:
+ results_urls = testrun.post_logs()
+ testrun.post_results_comment()
+ print("(Posted to Github)")
+ else:
+ post_script = os.path.join(os.path.dirname(sys.argv[0]), "post_pr_test.py")
+ print("To post the results to Github, run", post_script)
+
+
+if __name__ == '__main__':
+ import argparse
+ parser = argparse.ArgumentParser(description="Test a pull request for NetworkX")
+ parser.add_argument('-p', '--publish', action='store_true',
+ help="Publish the results to Github")
+ parser.add_argument('number', type=int, help="The pull request number")
+
+ args = parser.parse_args()
+
+ # Test for requests version.
+ import requests
+ major, minor, rev = map(int, requests.__version__.split('.'))
+ if (major, minor, rev) < (0, 10, 0):
+ print("test_pr.py:")
+ print("The requests python library must be version 0.10.0",
+ "or above, you have version",
+ "{0}.{1}.{2} installed".format(major, minor, rev))
+ print()
+ sys.exit(1)
+
+ test_pr(args.number, post_results=args.publish)