Run tests on sdist tarballs with `stack sdist` and `stack upload` #717
Note: I'm thinking that building and testing the sdists is the correct default. While it's a change in behavior, we should default to best practices and have a flag to turn this behavior off. This would also make things like #1568 reasonable, as it's more likely that the sdist will actually be tested against its full range of deps.
I agree. When writing a library, the sdist is the only thing one cares about. Application authors might get annoyed at having to keep their cabal files up to date, but I think correctness is more important than convenience here. Let me know if you start working on this; I'd be happy to help test it or help with some of the work. In the meantime, perhaps the travis-with-caching script should be updated to test the sdists? If I'm reading it right, it doesn't right now.
FYI, I just found a nixpkgs problem this would have prevented: NixOS/nixpkgs#15935. Though come to think of it, I've written tests before that required checking out a git submodule before being run (also breaking on nixpkgs), so it's an easy mistake to make.
I agree this is a good thing to implement, but it may be quite a while until I'd get around to implementing it, because while it is something I think stack should do, it is not something that frequently causes us direct difficulties in practice. The primary reason for sdists to be wrong is missing modules and TH reified files, and stack's warnings about this help a lot with that issue. So, feel free to implement this; it should be pretty straightforward. It ought to generate a random dir name, and unpack it to that dir, under
How about generating the haddocks in addition to running the tests? Knowing that the haddocks are valid is something I'd want to know before pushing my package to Hackage. It might even be good to open the haddocks so the user can look them over; at the least, they should see a link to the package's index page.
Great point! Yes, that should be part of the set of things to check.
Unfortunately I'm not going to get around to this any time soon, so it's up for grabs. 😧 If I do start on it I'll post here first so we don't do duplicate work.
A related issue that suggests a testing step for I guess I would like |
Ideas from #1350: `stack sdist` and `stack upload`.
Closing this out due to implementation in #3187. It does not support build parameters, but perhaps that's OK? Things usually ought to test successfully without needing to add options.
Currently, in order to test that all the necessary files are in a source distribution tarball, you need to manually try building it. This feature has precedent in cabal-src and possibly elsewhere.
I'm thinking something like `stack sdist --test` to both generate and test the tarballs. It'd also be nice to have `stack upload --test`, which would test the tarballs and upload them if they pass.

The first step is to extract the tarballs into a directory, and the next step is to build + test the code with the current project's stack.yaml configuration.
I'm not sure what the ideal implementation of this would look like. One possibility would be to have the extraction directory mimic the structure of the mega-repo, and then copy the stack.yaml into it. This would be rather costly in rebuild time and disk space, though.
For my own testing, I've just modified my `stack.yaml` to instead point at the extracted directories. This suggests an implementation approach where `test` is run, but with portions of the packages list substituted. This approach's downside is that it stomps on your old local install root. I don't see this as a very large downside.