Official document (PEP?) your tests are based on? #96

Closed · buhtz opened this issue Feb 10, 2023 · 2 comments

@buhtz commented Feb 10, 2023

Is there an official document your tests are based on?
Or are these tests based on your "opinion"?

Let me give you an example.

I got the message "The classifiers should specify what Python versions you support." or "The classifiers should specify what minor versions of Python you support as well as what major version."

IMHO using classifiers to specify the supported versions is not a good idea and is redundant.

[project]
# ...
requires-python = ">=3.8"
# ...
classifiers = [
    "Programming Language :: Python :: 3.8",
    # ...
]

You see, there are two places where I can specify the Python version, and pyroma makes both of them mandatory. I have to fill in both fields to make pyroma happy. 😃
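For what it's worth, here is my guess at a classifiers list that would satisfy both messages, with the major-version classifier plus one entry per supported minor version (an illustration only; the exact entries depend on which versions you actually support):

classifiers = [
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
]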

I also asked on setuptools about an official document.

@regebro (Owner) commented Feb 10, 2023

No, there is no official document, just best practices. Yes, you do indeed need to use both: one specifies a minimum, the other is useful for specifying which versions are officially supported.

Also, "happy" is up to you. There is no requirement to get full marks, it's entirely up to you if you want to implement pyroma recommendations or not. :-D

@CAM-Gerlach (Collaborator) commented Feb 10, 2023

Like @regebro said, Pyroma is not about "mandatory requirements" but about generally agreed (though inherently somewhat opinionated) best practices. You can set a specific score at which Pyroma will pass or fail (exit 0 or 1) with the -n/--min CLI argument, and #69 is open to allow disabling specific checks; PRs welcome.

If you just want to check your pyproject.toml metadata and configuration against the specifications in the relevant PEPs (517, 518, 621, etc.), then validate-pyproject may fit your needs better. Backends like Setuptools run it automatically when building your project, and as discussed in #94, we could add it as a preliminary check here before running the backend, to provide more informative output on user errors like the one in that issue.
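For illustration, the two checks would be invoked along these lines (a sketch; the threshold of 8 is just an example value, and the validate-pyproject CLI is assumed to be installed via its CLI extra):

# Exit 1 if the package rates below 8/10, otherwise exit 0
pyroma -n 8 .

# Check pyproject.toml against the packaging specs instead
# (assumes: pip install "validate-pyproject[all]")
validate-pyproject pyproject.toml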

Also, in regard to this specific issue: classifiers are particularly important for specifying the upper bound that the package is actually tested to work against, without being a hard cap for the solver like requires-python (which causes a bunch of problems in practice). See this discussion on the official Packaging Discourse for more information and context.
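To make that concrete, the pattern being recommended looks roughly like this (my sketch, not text from that discussion):

# Discouraged: a hard cap that the solver enforces
# requires-python = ">=3.8,<3.12"

# Preferred: an uncapped floor, with the tested upper bound in the classifiers
requires-python = ">=3.8"
classifiers = [
    "Programming Language :: Python :: 3.11",  # highest version actually tested
]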

> I also asked on setuptools about an official document (pypa/setuptools#3816).

You might want to ask in the official Python Packaging Discourse instead, since this is spec/usage advice and not anything specific to the implementation in any one particular backend (like Setuptools).

@CAM-Gerlach closed this as not planned on Feb 10, 2023.