Port coordinates module #1414

Merged
merged 5 commits into MDAnalysis:develop on Jun 24, 2017

Conversation

5 participants
@utkbansal
Member

utkbansal commented Jun 21, 2017

As discussed, this is some ancient work and might involve a messy rebase.

This was the status back then:
PyTest - 1408 passed, 68 skipped
Nose - Ran 1675 tests in 172.808s OK (SKIP=64)

Partially Fixes #884

Changes made in this Pull Request:

PR Checklist

  • Tests?
  • Docs?
  • CHANGELOG updated?
  • Issue raised/referenced?
@@ -16,7 +16,7 @@
from MDAnalysisTests.coordinates.base import (BaseReaderTest, BaseReference,
BaseWriterTest)
from MDAnalysisTests import tempdir, make_Universe
from numpy.testing import TestCase

@richardjgowers

richardjgowers Jun 21, 2017

Member

is numpy.testing.TestCase the same as unittest.TestCase?

@utkbansal

utkbansal Jun 21, 2017

Member

Looks like it - https://github.com/numpy/numpy/blob/master/numpy/testing/__init__.py#L10
Should be able to just use unittest.TestCase in its place.

@kain88-de

kain88-de Jun 21, 2017

Member

Please use unittest.TestCase consistently then.
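
For illustration only, a minimal sketch of that swap (the class and test names below are hypothetical, not from this diff):

```python
# Import TestCase from the standard library instead of via numpy.testing,
# which simply re-exports the same class.
from unittest import TestCase  # was: from numpy.testing import TestCase


class TestReaderPlaceholder(TestCase):  # hypothetical test class
    def test_placeholder(self):
        self.assertTrue(True)
```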

@utkbansal

Member

utkbansal commented Jun 21, 2017

It's interesting that the minimal build passes but the full build fails. From pytest's point of view, both should be the same.

utkbansal added some commits Feb 14, 2017

@utkbansal

Member

utkbansal commented Jun 21, 2017

This should fix the error, but I expect coverage to drop.

@kain88-de

Member

kain88-de commented Jun 21, 2017

@utkbansal you should be able to see the combined coverage report offline on your laptop as well. It is also a good idea to generate a coverage report (HTML is best) of just the coordinates module from the develop branch and use that as a reference. Doing all of these tests locally will be much faster than waiting for Travis.
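
For reference, one way to do this locally might look like the sketch below. It assumes pytest-cov is installed and that the coordinates tests live under MDAnalysisTests/coordinates; the paths are assumptions, not taken from this PR.

```python
# Sketch: run only the coordinates tests and write an HTML coverage report
# restricted to the coordinates package, so it can be compared against the
# same style of report generated from the develop branch.
import pytest

pytest.main([
    "MDAnalysisTests/coordinates",    # assumed location of the ported tests
    "--cov=MDAnalysis.coordinates",   # limit coverage to the coordinates module
    "--cov-report=html",              # pytest-cov writes htmlcov/index.html by default
])
```

Generating the equivalent report on the develop branch then gives a reference against which any coverage drop can be checked.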

@utkbansal

Member

utkbansal commented Jun 22, 2017

Something fishy is going on with coveralls. 🤔

@kain88-de

Member

kain88-de commented Jun 22, 2017

What do your offline tests show for the coverage?

@utkbansal

Member

utkbansal commented Jun 22, 2017

@kain88-de

Member

kain88-de commented Jun 22, 2017

Can you fix the coverage drop locally so you have the same pytest coverage as your nose reference?

@utkbansal

Member

utkbansal commented Jun 22, 2017

@kain88-de

Member

kain88-de commented Jun 22, 2017

That is not the first time coveralls has had such hiccups. I'm not sure what the reason for this is, but we can still use the numbers to check for large drops in coverage.

@richardjgowers richardjgowers merged commit 01bd066 into MDAnalysis:develop Jun 24, 2017

3 checks passed

QuantifiedCode No new issues introduced.
continuous-integration/travis-ci/pr The Travis CI build passed
coverage/coveralls First build on develop at 88.157%
@jbarnoud

Contributor

jbarnoud commented Jun 24, 2017

Avoid "Partially Fixes" in PR descriptions: github does not know about "partially" and closes the referenced issue.

@utkbansal

Member

utkbansal commented Jun 24, 2017

@richardjgowers @jbarnoud @kain88-de Woah, this wasn't supposed to be merged yet!

@jbarnoud

Contributor

jbarnoud commented Jun 24, 2017

@utkbansal Do you think it hurts that it is merged? From what I can see, I do not think so, but maybe I am missing something. If not, let's just keep this as it is and you can continue your changes from here.

In the future, prefix the title of your PR with "[WIP]" to make it clear that it is not ready. If you can add labels to PRs, you can also attach the "Work in progress" label. It is a bright red tag that should warn us to proceed with caution.

@utkbansal

Member

utkbansal commented Jun 24, 2017

@jbarnoud There is a major drop in coverage! I don't have permission to add labels, but I will add [WIP] to the titles of all future PRs.

@utkbansal utkbansal referenced this pull request Jun 24, 2017

Merged

Run the pytest testsuite in parallel #1418

@richardjgowers

Member

richardjgowers commented Jun 24, 2017

Ah, sorry, it had green lights. Open an issue for the drop in coverage.
