Port coordinates module #1414
Conversation
@@ -16,7 +16,7 @@
from MDAnalysisTests.coordinates.base import (BaseReaderTest, BaseReference,
                                              BaseWriterTest)
from MDAnalysisTests import tempdir, make_Universe

from numpy.testing import TestCase
Is numpy.testing.TestCase the same as unittest.TestCase?
Looks like it - https://github.com/numpy/numpy/blob/master/numpy/testing/__init__.py#L10

Should be able to just use unittest.TestCase in its place.
Please use unittest.TestCase consistently then.
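In old NumPy releases, `numpy.testing.TestCase` was simply a re-export of `unittest.TestCase`, so the swap is a one-line import change. A minimal sketch (the test class and assertion are hypothetical stand-ins for the ported coordinate tests, not code from this PR):

```python
import unittest

# Replace `from numpy.testing import TestCase` with the stdlib import.
# In old NumPy, numpy.testing.TestCase was a re-export of unittest.TestCase
# (it has since been removed from NumPy entirely), so subclasses behave
# identically after the swap.
class TestCoordinates(unittest.TestCase):
    """Hypothetical stand-in for one of the ported coordinate tests."""

    def test_positions_roundtrip(self):
        # Placeholder assertion; a real test would compare reader output
        # against a BaseReference from MDAnalysisTests.coordinates.base.
        self.assertEqual([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

Run with `python -m unittest` (or under pytest, which collects `unittest.TestCase` subclasses natively).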
It's interesting that the minimal build passes but the full build fails. From pytest's point of view, both should be the same.
This should fix the error, but I expect coverage to drop.
@utkbansal you should be able to see the combined coverage report offline on your laptop as well. A good idea is also to use a coverage report (HTML is best) of just the coordinates module from the develop branch and use that as a reference. Doing all of these tests locally will be much faster than waiting for travis.
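One way to generate such a scoped report locally is to limit coverage collection to the coordinates module in the pytest configuration. A sketch, assuming `pytest-cov` is installed; the section name and module path are illustrative, not taken from this PR:

```ini
# setup.cfg (sketch): scope coverage to the coordinates module so that
# `pytest MDAnalysisTests/coordinates/` writes an HTML report to htmlcov/,
# which can be compared against a report generated from develop.
[tool:pytest]
addopts = --cov=MDAnalysis.coordinates --cov-report=html
```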
Something fishy going on with coveralls. 🤔
what do your offline tests show for the coverage? |
My system shows a drop in coverage, which I was expecting. The problem with coveralls is that it is passing the check even though the coverage has dropped.
Can you fix the coverage drop locally so you have the same pytest coverage as your nose reference?
Yes, I'm working on it. But I would feel better if coveralls worked.
This is not the first time coveralls has had such hiccups. I'm not sure what causes them, but we can still use the numbers to check for large drops in coverage.
Avoid "Partially Fixes" in PR descriptions: GitHub does not understand "partially" and will still close the referenced issue when the PR is merged.
@richardjgowers @jbarnoud @kain88-de Woah, this wasn't supposed to be merged yet! |
@utkbansal Do you think it hurts that it is merged? From what I see I do not think so, but maybe I am missing something. If not, let's just keep this here and you can continue your changes from there. In the future, prefix the title of your PR with "[WIP]" to make clear it is not ready. If you can add labels to PRs, you can also attach the "Work in progress" label. It is a bright red label that should warn us to proceed with caution.
@jbarnoud There is a major drop in coverage! I don't have permission to add labels but will prefix the titles of all future PRs with [WIP].
Ah sorry, it had green lights. Please open an issue for the drop in coverage.
As discussed, this is some ancient work and might involve a messy rebase.
This was the status back then:
pytest - 1408 passed, 68 skipped
nose - Ran 1675 tests in 172.808s, OK (SKIP=64)
Partially Fixes #884
Changes made in this Pull Request:
PR Checklist