
Add NDTiff storage to the build automation #546

Open
henrypinkard opened this issue Feb 9, 2023 · 5 comments
Labels
enhancement New feature or request

Comments

@henrypinkard
Member

Much like updates to the Java dependencies trigger a new pycromanager version, updating the version of NDTiff should automatically trigger a new release of pycro-manager on PyPI. This likely requires expanding the GitHub Action in NDTiffStorage and adding a new script in pycromanager that updates requirements.txt and setup.py. Most of this can be copied from the actions for the Java deps, I think.

Linking a specific version of ndtiff to one of pycromanager is probably a good idea for reproducibility anyway
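A minimal sketch of what such a bump script could look like (the file layout and the `NDTIFF_VERSION` environment variable are assumptions here, not part of the existing automation):

```python
# bump_ndtiff_version.py -- hypothetical helper, not existing build automation.
# Rewrites the pinned ndtiff version in requirements.txt and setup.py so that a
# new NDTiff release (passed in by the triggering GitHub Action) propagates to
# the next pycro-manager release on PyPI.
import os
import re
from pathlib import Path


def bump_ndtiff(new_version: str) -> None:
    pattern = re.compile(r"ndtiff[><=]=[\d.]+")
    replacement = f"ndtiff=={new_version}"
    for filename in ("requirements.txt", "setup.py"):
        path = Path(filename)
        path.write_text(pattern.sub(replacement, path.read_text()))


if __name__ == "__main__":
    # Assumed to be provided by the workflow that dispatches the bump.
    bump_ndtiff(os.environ["NDTIFF_VERSION"])
```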

@henrypinkard added the enhancement (New feature or request) label Feb 9, 2023
@ieivanov
Collaborator

ieivanov commented Feb 9, 2023

We list ndtiff>=1.8.0 as a requirement, so pip will always install the latest version of ndtiff. Should we instead pin the ndtiff version? I think the current way of managing the ndtiff version works until there is a breaking change in the ndtiff API, at which point the developer would have to manually update the requirements within pycromanager.
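For illustration, the two approaches would look roughly like this in setup.py (version numbers are placeholders):

```python
# setup.py (sketch; not the actual pycromanager packaging configuration)
from setuptools import setup

setup(
    name="pycromanager",
    install_requires=[
        # current approach: any release at or above the minimum is acceptable
        "ndtiff>=1.8.0",
        # proposed approach: pin an exact release so a given pycromanager
        # version always ships with a known ndtiff version
        # "ndtiff==1.8.0",
    ],
)
```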

Managing the Python packages seems easier than managing the Java dependencies, because on the Java side there is an imposed link between the package version and the versions of its dependencies.

@henrypinkard
Member Author

Yeah I think you're right, it works currently, but to me pinning seems better in general, because then it is always guaranteed that pycromanager a.b.c comes with ndtiff x.y.z. So if there's a problem with a Dataset, knowing the pycromanager version is sufficient to trace the problem.

I think you can similarly specify minimum versions in Maven, so managing versions is essentially the same in Java or Python. You can also tell Maven to grab the latest version of a dependency, but as @marktsuchida has taught me, this is generally a bad idea for reproducibility.

@ieivanov
Collaborator

ieivanov commented Feb 9, 2023

It does make sense, though as far as I know that's not the standard practice in the Python community. I'm looking at the napari dependencies, for example: https://github.com/napari/napari/blob/main/setup.cfg

@henrypinkard
Member Author

henrypinkard commented Feb 9, 2023

I think the difference there may be that most of those are 3rd party dependencies. In that case, you want to be permissive on the range, because other things are installed that may also depend on different versions of those dependencies, and you want to let pip/conda have maximum flexibility in finding one that works for them all.

I think we could probably just follow that convention and stick to pinning only a minimum version. I guess it's possible to imagine a situation where you need a newer NDTiff but don't want to upgrade pycro-manager because of API changes that need to be synced with MM.
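A sketch of the specifier options being discussed (placeholders, not the actual pycromanager pins):

```python
# Illustrative dependency specifiers (hypothetical versions)
ndtiff_options = [
    "ndtiff>=1.8.0",       # minimum only: the convention napari follows
    "ndtiff>=1.8.0,<2.0",  # minimum plus an upper cap, to exclude a hypothetical breaking release
    "ndtiff==1.8.0",       # exact pin, as discussed above
]
```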

@ieivanov
Collaborator

ieivanov commented Feb 9, 2023

Ah, yes, that makes sense. OK, I agree!
