0015 - Code stability, versioning #15
Conversation
My suggestions:
Note that a "release" as I'm describing it does not necessarily have to mean a GitHub release. Or rather, there could potentially be many subsidiary GitHub releases on whatever schedule maintainers want. But the "MP Software Stack Releases" would be predictable and would specify particular versions.
Would a "minimum release cadence" also be useful? E.g., if a release is requested by a member of the community and there has not been a release in X months, then the Foundation is required to ensure that a release happens (even if this is a pre-release, etc.).
I really like the idea of a meta-package for newer users, that contains a known-compatible set of recommended codes.
The meta-package really sounds like a great idea! I will try to update the PR accordingly. (This time without any typos and homophones.)
While I can see some attractiveness in the metapackage, I would say it is largely not useful. The main reason is that a properly specified package would already cover dependencies. E.g., if someone installs atomate, the pymatgen version would already be specified and installed. My own packages only define the highest-level dependencies necessary. E.g., if I specify pymatgen as a dependency, I don't bother to specify numpy, monty, etc. This is the reason why there is no meta-package for even something like the scientific Python stack. If you install scipy, numpy is installed by default. If you install pandas, all the subdependencies are defined. Where this is not the case, the bloat is too large to be useful. E.g., anaconda installs all kinds of packages by default, but since I use only a fraction of them, I install miniconda. I don't think the "user-friendliness" aspect is worth losing control over package bloat.
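The "highest-level dependencies only" practice described above can be sketched as a `pyproject.toml` fragment. The package name and version bound below are hypothetical placeholders, just to illustrate the shape:

```toml
# Hypothetical pyproject.toml fragment illustrating the practice above:
# pymatgen already pulls in numpy, monty, etc. transitively, so those
# are deliberately NOT repeated here.
[project]
name = "my-analysis-package"    # placeholder name
version = "0.1.0"
dependencies = [
    "pymatgen>=2024.1.1",       # placeholder lower bound
    # numpy, monty, spglib, ... resolved transitively via pymatgen
]
```

The trade-off is that transitive versions float with whatever pymatgen itself allows, which is exactly the gap a pinned meta-package would try to close.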
I wonder if it would be useful to audit our codes and make sure they all follow this practice? For example,
I agree it is a useful principle. I would leave it to the respective package maintainers to adopt this if they want. In general, pmg and monty usually don't conflict, so it is not that big a deal. In the early days, it was more problematic because there were some incompatible numpy versions that would wreak havoc if you specified both pmg and numpy as dependencies and the versions were incompatible. This has not happened for a number of years...
An example that a metapackage might have prevented is here. To me, the key "feature" of a metapackage needs to be more extensive integration testing (e.g., testing a full atomate2 workflow with the specific versions of emmet and maggma, etc.) that goes beyond what currently happens in individual packages. I suspect there's a way this could be largely automated (e.g., set up a GitHub Action that periodically runs our expanded integration tests using the latest versions of all relevant codes). This would let us spot issues and pin the metapackage to the most recent working versions pretty easily.
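The scheduled GitHub Action described above might look roughly like this. This is only a sketch: the workflow name, the package list, and the `tests/integration` path are assumptions, not an existing workflow in any of these repos.

```yaml
# Sketch of a periodically scheduled integration-test workflow.
name: stack-integration-tests
on:
  schedule:
    - cron: "0 6 * * 1"   # weekly, Monday 06:00 UTC
  workflow_dispatch:       # allow manual runs too
jobs:
  integration:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install the *latest* released versions of the whole stack,
      # rather than pinned versions, so incompatibilities surface early.
      - run: pip install --upgrade atomate2 emmet-core maggma pymatgen
      # Hypothetical expanded test suite exercising full workflows.
      - run: pytest tests/integration
```

A green run of this job would then tell you a set of versions that are safe to pin in the meta-package.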
Yes, I think another benefit of a meta-package would be knowing the latest compatible version numbers!
We might also want to think about a recommendation on how quickly major software bugs are fixed and a new stable version is released, or how we can ensure that such fixes are provided reasonably fast.
Suggested next action: prototype a meta-package in a repo, possibly with an upload to TestPyPI.
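A prototype meta-package could be little more than a `pyproject.toml` with exact pins. The package name and every version pin below are placeholders, not vetted compatible versions; the actual pins would come from integration testing:

```toml
# Sketch of a meta-package: no code of its own, just exact pins.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "mp-stack"          # hypothetical meta-package name
version = "2024.1.0"       # e.g., a date-based stack release number
description = "Known-compatible set of MP stack codes (sketch)"
dependencies = [
    "pymatgen==2024.1.1",  # all pins below are placeholders
    "atomate2==0.0.13",
    "emmet-core==0.77.1",
    "maggma==0.60.2",
]
```

It could then be built with `python -m build` and uploaded for testing with `twine upload --repository testpypi dist/*`.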
Sorry, messed something up here 😬. Will clean up tomorrow.
Summary
I am wondering if we could come up with rules and ideas to make our codes more stable for outside users, e.g., rules about breaking changes, release frequency, and similar.
I have left out the proposal part so far, as I have no clear ideas about solutions just yet, and I would love to discuss this.