PEP8 testing #1945
Conversation
# local directory which is not in the repository) by adding a
# ".pep8_test_exclude.txt" file in the same directory as this test.
# The file should be a line separated list of filenames/directories
# as can be passed to the "pep8" tool's exclude list.
Shouldn't this comment above just be a docstring? Also, it seems out of date with what's actually here.
I don't think it's out of date (though I suppose there is more going on in the method than the comment suggests?). Is there a specific part which is out of date?
If I convert it to a docstring, you end up with annoying output when running the tests. That can be avoided, though (see http://stackoverflow.com/questions/12962772/how-to-stop-python-unittest-from-printing-test-docstring).
Ah, I see. Let's leave it as a comment then.
I was confused by the list of exclusions in this file and the mention of `.pep8_test_exclude`. Why are there two different methods for defining exclusions?
I think this is a great idea. My only worry is that some PEP8 warning categories can and should reasonably be broken in certain contexts (some of the whitespace ones, for example). I wonder if there are any warning categories that, if we turned them off, would make more files compliant...?
@mdboom We can filter warnings from pep8. I would suggest filtering the following (which are a bit strict). I think the most important thing is not to check whether a file is PEP8-compliant, but whether it is becoming more and more compliant over time. I'm not sure how we could do this automatically in the tests, though. On scikit-learn we have a Jenkins job for this, which just plots how many warnings we have; the goal is to have fewer and fewer. It would be the same with test coverage: we want it to increase.
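As a sketch of that filtering (the codes and names here are illustrative, not an agreed list), one could post-process pep8's default report lines:

```python
# Illustrative ignore set; the thread does not fix the exact codes.
IGNORED_CODES = {'E225', 'E501'}


def filter_report(report_lines, ignored=IGNORED_CODES):
    """Drop pep8 report lines whose error code is in `ignored`.

    pep8's default output format is
    "path.py:12:5: E225 missing whitespace around operator",
    so the code is the first token after the first ": " separator.
    """
    kept = []
    for line in report_lines:
        head, sep, rest = line.partition(': ')
        if sep and rest and rest.split(None, 1)[0] in ignored:
            continue
        kept.append(line)
    return kept
```

(Using flake8 or pep8's own `--ignore` option achieves the same effect without post-processing; this just shows the idea.)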
Maybe we could add a pyflakes check also? I've spotted and fixed a lot of small bugs thanks to it.
@NelleV: I agree that tracking PEP8 compliance would be useful. There are really two use cases here: 1) making sure new code in a PR is PEP8-compliant by having the Travis build fail, and 2) tracking that overall compliance is getting better. I plan to have a ShiningPanda (Jenkins) instance up soon for matplotlib (we just got the payment sent), so we should definitely run these kinds of graphs there. As for …
@mdboom I don't know about pyflakes on its own, but we can turn this off when using flake8 (pyflakes + pep8), the same way we can filter pep8 warnings out.
This is achievable if we kept a log of the pep8 output and ran a diff each time we run the tests (this could be automated). The only problem is that we would need to update the results every time we make a minor PEP8 fix... Given how much work you've done, @NelleV, I was hopeful we could start to rattle off some of the files on the list pretty quickly. Once they are PEP8-compliant (and tested), they should stay that way with this test...
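The "log plus diff" idea can be reduced to a count ratchet. A minimal sketch, where the baseline-file handling and the function name are assumptions, not anything from this PR:

```python
import os


def check_pep8_ratchet(current_count, baseline_path):
    """Fail if the number of PEP8 warnings grew; otherwise record it.

    Reads the previously logged warning count from `baseline_path`
    (a missing file means "no baseline yet"), raises if the current run
    produced more warnings, and writes the new count back so that every
    improvement becomes the new floor.
    """
    baseline = None
    if os.path.exists(baseline_path):
        with open(baseline_path) as fh:
            baseline = int(fh.read().strip())
    if baseline is not None and current_count > baseline:
        raise AssertionError(
            'PEP8 warnings increased from %d to %d'
            % (baseline, current_count))
    with open(baseline_path, 'w') as fh:
        fh.write(str(current_count))
```

The downside mentioned above still applies: the recorded baseline changes on every minor PEP8 fix, so it would need committing alongside the fix.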
The error E225 (missing whitespace around operator) should also be blacklisted. The operator expression examples in PEP8 were just plain wrong for quite some time and many people and PEP8 check-tools picked up that error. I recently filed a bug (http://bugs.python.org/issue16239) and the examples were changed, but I guess it will take some time until this change is reflected in checking-tools.
It has already been fixed. |
Ah sorry, the pep8 package in Ubuntu 13.04 was of course outdated. pep8 1.4.5 indeed doesn't complain about that anymore.
Thanks for your comments. @mdboom - I think this is a case of Merge or Close. I'm fine either way, but I don't think there is much point in this sitting here if, collectively, we aren't keen to have it in the codebase.
I'm 👍 on the idea. At the moment, there are some PEP8 failures, and we should fix those before merging this. Can you clarify the comment to address my question from before? The only concern I have is that there appear to be two different ways to define exclusions and I'm not sure why one would choose one over the other. |
Hmmm. They weren't failing on my machine. I'll investigate.
The important list - the modules enumerated in the test itself - is obviously the modules which are not yet compliant. The second list of exclusions comes from a file which is not under version control (you, the user, control what goes in there) - this is for files which are in your clone but not in the repository itself. I'm not averse to removing the second exclusion mechanism - it was useful to me when I wrote a similar test for cartopy, but I'm not currently using it in matplotlib...
Travis tests the PR branch merged with the destination branch. So in this case, I wouldn't be surprised if there's something on master that broke PEP8 conformance. Try rebasing your branch on the current master and testing that.
Turns out that when you install a package which has shared objects as an egg with setuptools, you get extra "wrapper" modules which are not PEP8 compliant.
I'm going to put in a patch to setuptools to make the output PEP8 compliant.
Ok. In the meantime, I suspect we can just ignore the bootstrap files. EDIT: By that, I mean have this system ignore them. |
It's a bit harder than that. Generally we don't want to list files in the exclusions if they are not part of the repository. I've submitted the PR https://bitbucket.org/tarek/distribute/pull-request/40 and will now work on the workaround...
I guess I just meant that if there's a way that we could automatically detect that something is a bootstrap file, we wouldn't PEP8 check it. I'm not sure I follow what the objection to that is. Even if your patch is accepted upstream, it will be a while until it trickles down and everyone has the fix to distribute (and distribute is soon to be replaced, so it could be quite some time).
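One way such auto-detection could work (a sketch, not code from this PR): setuptools-generated wrapper modules define a `__bootstrap__()` function, so scanning the head of each file for that marker is usually enough to skip them without listing each one in the exclusions. Treat the marker string as an assumption about the generated output.

```python
def looks_like_bootstrap(path):
    """Heuristically detect a setuptools/distribute "wrapper" module.

    Reads the first couple of kilobytes of the file and checks for the
    `__bootstrap__` function the egg machinery generates.  Unreadable
    or missing files are simply treated as non-bootstrap.
    """
    try:
        with open(path) as fh:
            head = fh.read(2048)
    except IOError:
        return False
    return 'def __bootstrap__' in head
```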
I've just rebased again. pep8 testing on python3 is a no-go - 2to3 is producing some non-compliant output. Hopefully this will be the last commit... |
The tests are passing on python2.6, and an unrelated failure has occurred on the python2.7 build. I think this PR is good to go.
I've logged the failure in #2046. |
@mdboom - once this is merged, I'm planning to reduce the number of files which are explicitly excluded by fixing up some of the more simple PEP8 problems. |
I'll help with that too in my (almost non-existent) free time.
To be fair @NelleV you've already done all of the hard work - I'd just be sweeping up any remaining issues that have been introduced since your work, and having the satisfaction of knowing that the code will continue to remain compliant once the filename is struck off the list 😄 |
Bump. Sorry to pester @mdboom but this is one I don't want to go stale... |
Looks good, now that all tests are passing. |
Added a coding_standards test module. Currently checks PEP8 compliance on all files except those explicitly excluded.
@NelleV's work on getting PEP8 compliance has been really valuable - this test makes sure that that work is being carried on by all of us in our development efforts.