
Conversation

@DimitriPapadopoulos

While it would be best to match Python suppression comments and CodeQL
suppression comments, in the short term this will avoid false positives.

Takes into account pycodestyle error codes (both E and W):
https://pycodestyle.pycqa.org/en/latest/intro.html#error-codes

Does not take into account flake8/mccabe error codes (F and C):
https://flake8.pycqa.org/en/latest/user/error-codes.html

Fixes #6517.

@DimitriPapadopoulos DimitriPapadopoulos requested a review from a team as a code owner August 22, 2021 05:59
While it would be best to match Python suppression comments and CodeQL
suppression comments, in the short term this will avoid false positives.

Takes into account pycodestyle error codes (both E and W):
https://pycodestyle.pycqa.org/en/latest/intro.html#error-codes

Takes into account flake8 error codes (F) but not mccabe (C):
https://flake8.pycqa.org/en/latest/user/error-codes.html
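
For illustration only, and not the extractor's actual matching code: the difference between accepting just the generic form and also accepting code-qualified forms can be sketched with two regular expressions (the sample lines and codes below are made up):

    import re

    # Generic form only: "# noqa" with nothing but whitespace after it.
    BARE_NOQA = re.compile(r"#\s*noqa\s*$", re.IGNORECASE)

    # Code-qualified form: "# noqa: E501" or "# noqa: F401, E203".
    CODED_NOQA = re.compile(
        r"#\s*noqa\s*:\s*[A-Z]+[0-9]+(?:\s*,\s*[A-Z]+[0-9]+)*\s*$",
        re.IGNORECASE,
    )

    for line in (
        "import os  # noqa",
        "import os  # noqa: F401",
        "x = {'a':1}  # noqa: E231",
    ):
        comment = line[line.index("#"):]
        kind = ("bare" if BARE_NOQA.search(comment)
                else "coded" if CODED_NOQA.search(comment)
                else "no match")
        print(f"{line!r:40} -> {kind}")
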
@tausbn
Contributor

tausbn commented Aug 25, 2021

Thank you for your submission.

Unfortunately, we cannot accept this PR, as it would allow a relatively low severity noqa rule to mask a higher severity problem detected by LGTM. This is the reason we only accept the generic form of noqa.

For your use case, would it not be sufficient to add lgtm (if necessary delimited with a ;) to the lines in question? This would have the same effect as the changes in your PR, but would make it clear and explicit that all LGTM alerts are excluded on that line.
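
A minimal sketch of that suggestion, assuming the flake8 code being suppressed is F401 (the code and the import are only examples):

    # Keep the flake8-specific code and append "lgtm", delimited with ";",
    # so that LGTM alerts on this line are suppressed as well.
    import os  # noqa: F401; lgtm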

@DimitriPapadopoulos
Author

DimitriPapadopoulos commented Aug 25, 2021

Adding different suppression comments for each linter is not a solution: one suppression comment for LGTM.com, one for codespell, one for flake8, one for Coverity, and so on... We'll end up with more suppression comments than actual code.

The standard for Python is noqa; you need to support it.

I need to stress that, because of the lack of decent support for noqa in LGTM.com, programmers will be tempted to use a bare noqa to please LGTM.com, and, as you write, this may "mask a higher severity problem" detected by LGTM and other linters. At least with my patch, flake8 can still detect the higher severity problem.

In effect, in this specific context, LGTM.com results in lower security.

@tausbn
Contributor

tausbn commented Aug 25, 2021

I can understand your frustration. It would be great if there were a universally agreed-upon scheme for specifying suppression comments. (Though I would personally prefer one that conveyed a bit more information than just a number that one would then have to go look up elsewhere.)

In that case, my best suggestion to you would be to change the existing noqa:XXX to noqa, or to remove all LGTM alerts by specifying this in an appropriate lgtm.yml file.
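
A minimal lgtm.yml along those lines might look as follows; the query identifier is only a placeholder, since the actual name depends on which alert is being suppressed:

    # lgtm.yml at the repository root
    queries:
      - exclude: py/unused-import

Unlike a per-line comment, such an exclusion applies to the whole repository rather than to individual lines.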

I understand that this is likely not the answer you're looking for, but what you're suggesting (though only a minor change) has a fairly big impact on how alert suppression works. Other users may be relying on the fact that specific noqa annotations do not hide results of the LGTM analysis.

@DimitriPapadopoulos
Author

DimitriPapadopoulos commented Aug 26, 2021

I have changed the relevant code to noqa.

However, I'm not certain you fully understand the effect of this change. To use your own words, "what you're suggesting (though only a minor change) has a fairly big impact on how alert suppression works":

  • Again, to use your own words, this may "mask a higher severity problem detected by LGTM".
  • Even worse, it may also mask a higher severity problem that could have been detected by flake8. To adapt your own words, users may be relying on the fact that specific noqa annotations are not hidden by the requirements of the LGTM analysis (see the sketch below).
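
To make the second point concrete (the codes and imports below are only examples):

    # Code-qualified suppression: only the unused-import warning (F401)
    # is silenced; flake8 still reports anything else on this line.
    import os  # noqa: F401

    # Bare suppression: every flake8 check on this line is silenced,
    # including any higher severity problem introduced later.
    import json  # noqa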

On the one hand, we'd like to address LGTM alerts, because LGTM is pushing to become some sort of bug authority, and projects don't want to be pointed at, especially in the case of false positives. On the other hand, fixing LGTM alerts by changing noqa: F417 to a mere noqa does lower code quality. No frustration here, just facts: in this (very specific) case, LGTM does lower quality standards.

I'll look into the lgtm.yml solution. Thank you for mentioning it, although I suspect it may be too coarse-grained.

@DimitriPapadopoulos
Author

DimitriPapadopoulos commented Aug 26, 2021

I can understand why you cannot accept the patch: your priority is not the efficiency of the whole ecosystem around Python and overall Python code quality, but the precision of LGTM alerts by themselves. These two targets do not necessarily align, and they don't have to. It is up to end users to find solutions to work around false positives without sacrificing code quality.

Perhaps this discussion should have taken place in issue #6517, not in this PR; that issue should remain open until noqa support can be improved.

The problem of the lone noqa not being taken into account remains; see the comment above. Should I open a new issue about it?

@yoff
Contributor

yoff commented Aug 30, 2021

The issue of the comment not being taken into account, mentioned above, is at least easy to fix. One way to do it is here: #6570

Development

Successfully merging this pull request may close these issues.

LGTM.com - false positive - 'noqa' suppression comments
