Add pytest-flake8 #3945

Merged: 6 commits from the add_pytest_flake8 branch into scrapy:master on Nov 7, 2019

Conversation

noviluni (Member) commented Aug 7, 2019

Proposal for: #3944

Please bear in mind that this is only a PoC. Done this way, the testing time increases, and it may not be necessary to check flake8 against all environments.
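For readers unfamiliar with the plugin, this is roughly how pytest-flake8 gets wired in. A minimal sketch only, assuming the plugin is installed; it is not necessarily the exact configuration added in this PR:

```ini
# pytest.ini (illustrative, not the exact file from this PR)
[pytest]
# pytest-flake8 adds a --flake8 flag; with it, every collected .py file
# gets an extra test item that runs flake8 against that file.
addopts = --flake8
```

With that in place, `pip install pytest-flake8 && pytest` runs the style checks as part of the normal test run, which is why every tox/Travis environment would otherwise repeat them.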

codecov-io commented

Codecov Report

Merging #3945 into master will decrease coverage by 0.55%.
The diff coverage is n/a.

@@            Coverage Diff            @@
##           master   #3945      +/-   ##
=========================================
- Coverage   84.96%   84.4%   -0.56%     
=========================================
  Files         166     166              
  Lines        9681    9681              
  Branches     1445    1392      -53     
=========================================
- Hits         8225    8171      -54     
- Misses       1194    1233      +39     
- Partials      262     277      +15
| Impacted Files | Coverage Δ |
| --- | --- |
| scrapy/link.py | 86.36% <0%> (-13.64%) ⬇️ |
| scrapy/utils/gz.py | 86.48% <0%> (-13.52%) ⬇️ |
| scrapy/_monkeypatches.py | 63.63% <0%> (-9.1%) ⬇️ |
| scrapy/mail.py | 70.23% <0%> (-5.96%) ⬇️ |
| scrapy/utils/reqser.py | 88.23% <0%> (-5.89%) ⬇️ |
| scrapy/robotstxt.py | 91.8% <0%> (-4.92%) ⬇️ |
| scrapy/utils/iterators.py | 93.47% <0%> (-4.35%) ⬇️ |
| scrapy/item.py | 94.2% <0%> (-4.35%) ⬇️ |
| scrapy/downloadermiddlewares/httpproxy.py | 96.22% <0%> (-3.78%) ⬇️ |
| scrapy/downloadermiddlewares/decompression.py | 96.61% <0%> (-3.39%) ⬇️ |
| ... and 10 more | |


codecov-io commented Aug 9, 2019

Codecov Report

Merging #3945 into master will increase coverage by 0.01%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master    #3945      +/-   ##
==========================================
+ Coverage   83.36%   83.37%   +0.01%     
==========================================
  Files         165      165              
  Lines        9802     9802              
  Branches     1462     1462              
==========================================
+ Hits         8171     8172       +1     
  Misses       1366     1366              
+ Partials      265      264       -1
| Impacted Files | Coverage Δ |
| --- | --- |
| scrapy/utils/trackref.py | 86.48% <0%> (+2.7%) ⬆️ |

noviluni force-pushed the add_pytest_flake8 branch 3 times, most recently from 6a2aac0 to 697168a, on August 9, 2019 18:18
noviluni marked this pull request as ready for review on August 9, 2019 18:37
noviluni (Member, Author) commented Aug 9, 2019

It seems the pipeline worked: https://travis-ci.org/scrapy/scrapy/jobs/569955789

Gallaecio (Member) commented

@noviluni Looks great. However, be ready for something like #3727 (comment) from @dangra :)

Gallaecio mentioned this pull request on Aug 12, 2019
noviluni (Member, Author) commented

Hi @Gallaecio, thank you for your feedback. I've been thinking about that, and I don't fully agree (in fact, I can't fully understand the reasons)... The idea here is to avoid big PRs that need to be reviewed, and adding noqa marks to every line/function/file implies a lot of changes :/

As my original proposal was to avoid exactly that, it makes no sense to me. I can't see the benefit of adding all those lines and then deleting them when fixing the code. It means doing the work twice and probably adds more "risk" (looking at your PR: 170 files to review is kind of risky).

@dangra, what do you think? Is it necessary to mark all the cases one by one and then start fixing them, or could we start this way and, when fixing each flake8 rule, add the necessary noqa marks only to the "exceptions" we want to keep?

Thanks in advance
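To make the two options being discussed concrete (an illustration only, not code from either PR): the approach discussed in #3727 marks each offending line, while this PR keeps the source untouched and moves the exclusions into configuration.

```python
# Per-line approach (as in #3727): every violation gets its own marker,
# which touches a large number of files at once.
import os  # noqa: F401  # intentionally unused import, silenced in place

# Per-file approach (this PR): the source stays as-is and the offending
# files/rules are listed in the pytest-flake8 configuration instead
# (see the pytest.ini sketch after the next comment).
```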

noviluni (Member, Author) commented Sep 27, 2019

Updated to use a list of files (with the excluded rules) instead of ignoring all rules directly. That way it is possible to disable checks only for the files that are not yet passing them.
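For reference, a sketch of what such a per-file list can look like with pytest-flake8's flake8-ignore option; the file names and error codes below are made up for illustration and are not the actual entries from this PR:

```ini
# pytest.ini (illustrative entries only)
[pytest]
addopts = --flake8
# Each line is "<file pattern> <flake8 codes to ignore for it>".
# Files not listed here are checked against the full rule set,
# so entries can be removed one by one as files get cleaned up.
flake8-ignore =
    docs/conf.py ALL
    scrapy/mail.py E501
    tests/test_*.py E501 F401
```

Compared with a blanket ignore list, this keeps new files and already-clean files fully checked while the backlog is worked through.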

Gallaecio (Member) left a review comment

I’ve left a couple of comments, but I’m OK with merging this as is.

(Inline review comments on .travis.yml and pytest.ini were resolved.)
noviluni (Member, Author) commented Sep 30, 2019

> I’ve left a couple of comments, but I’m OK with merging this as is.

Thank you for your feedback 😄

Gallaecio merged commit e8b1e46 into scrapy:master on Nov 7, 2019