
Stack does not respect lower bound specified in the cabal file. #4495

Closed
recursion-ninja opened this issue Jan 8, 2019 · 11 comments

recursion-ninja commented Jan 8, 2019

General summary/comments

I had added the following line to our .cabal file after we fixed the vector library to be compatible with compact regions so we could add back vector parallelism:

  -- >=0.12.0.2 required for compatibility with "compact regions"
  , vector             >=0.12.0.2

Turns out stack ignores this very explicitly defined lower bound. You have to hold its hand and also add this to the stack.yaml file:

extra-deps:                                                                                             
  - vector-0.12.0.2

This is a deficient solution, because now we will always build with vector-0.12.0.2, even if there is a newer version available. stack should attempt to retrieve a package satisfying the version constraints from hackage if one cannot be found in the resolver list.

Steps to reproduce

See this commit for the defect:
amnh/PCG@beb9ec0#diff-0d579eb41129e06212ecf283405e96a0

Just stack build and run any dataset with multiple characters.

Expected

Use a version of vector that satisfies the version constraints specified in the .cabal file, if one exists (it does).

Actual

Used a different version of the vector library which is incompatible with compact and throws run-time exceptions, making our software useless and wasting days of my time to diagnose.

Stack version

$ stack --version
Version 1.9.1, Git revision f9d0042c141660e1d38f797e1d426be4a99b2a3c (6168 commits) x86_64 hpack-0.31.0

Method of installation

  • Official binary, downloaded from stackage.org or fpcomplete's package repository

dbaynard commented Jan 8, 2019

Hi @recursion-ninja, you've got allow-newer: true set in your stack.yaml file. Despite its name it ignores lower bounds, in addition to upper bounds, matching cabal-install behaviour.

If you delete that you'll still need the extra-deps entry, else you'll get a compilation failure. The snapshot lts-12.13 provides vector-0.12.0.1. In that case, though, a compilation failure is what you want, correct?

If you're concerned about the extra-dep, I hope I can reassure you. It only affects you as a developer — users of your library and executable will not be restricted when new versions of vector are released — and when you upgrade (for development) to a newer stackage snapshot which contains a suitable version of vector, you can delete the extra-dep. You could consider that now, actually, and upgrade to lts-13, though that snapshot uses ghc 8.6.
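For illustration only, the snapshot upgrade described above might look like this; the exact lts-13 point release and the vector version it ships are assumptions to verify on stackage.org:

```yaml
# stack.yaml sketch after upgrading the snapshot
resolver: lts-13.0   # note: builds with GHC 8.6

# the pinned extra-dep can then be removed, assuming the new snapshot
# provides a vector satisfying >=0.12.0.2:
# extra-deps:
#   - vector-0.12.0.2
```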

Feel free to reopen if that doesn't resolve your issue, and by all means ask any further questions here.

@dbaynard dbaynard closed this as completed Jan 8, 2019
@dbaynard dbaynard added this to the Support milestone Jan 8, 2019

recursion-ninja commented Jan 8, 2019

In the .cabal file there exist two kinds of bound constraints: lower bounds and upper bounds. I know that the Haskell community has the PVP versioning policy, which is adhered to to some degree and which requires upper and lower bounds on package dependencies. Here's the thing, though: upper and lower bound constraints are different!

  • A lower bound on a dependency exists because the package developer knows that the code does not work with older versions of the dependency.

  • An upper bound on a dependency exists because a developer speculatively assumes that the code doesn't work with a newer (major) version of the dependency.

The flag allow-newer should only allow newer versions of a dependency to subvert the speculative assumption(s) that newer dependencies will not work.

The flag allow-newer should never allow older versions of a dependency and violate the certain knowledge that earlier dependencies do not work.

I really would have thought that this is straight forward logic, but apparently it needs to be articulated in an issue. It violates the principle of least astonishment!

This is an exceptional defect for stack with two solutions:

  • Keep allow-newer, change its functionality, and add another flag called allow-older for allowing older packages.
  • Rename allow-newer to allow-any-version.
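For illustration, the first proposal would make a stack.yaml read something like this; allow-older is hypothetical here, as stack did not support it at the time:

```yaml
allow-newer: true   # relax only speculative upper bounds
allow-older: true   # separately, and explicitly, ignore lower bounds
```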


recursion-ninja commented Jan 8, 2019

Shall we reopen this issue, or should I create a new, more aptly named issue:
"allow-newer actually also allows older"?


dbaynard commented Jan 9, 2019

I'll open a new one. It's a design change, not a bug.

From the documentation:

### allow-newer
(Since 0.1.7)
Ignore version bounds in .cabal files. Default is false.
```yaml
allow-newer: true
```
Note that this also ignores lower bounds. The name "allow-newer" is chosen to
match the commonly used cabal option.

Would you be interested in contributing changes?


dbaynard commented Jan 9, 2019

stack should attempt to retrieve a package satisfying the version constraints from hackage if one cannot be found in the resolver list.

You can use stack solver for this — but you will need to add the corresponding versions to your stack.yaml file.
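As a sketch, that workflow looked roughly like this in stack 1.x; note that stack solver was removed in stack 2.x, so treat this as historical:

```
# compute extra-deps for packages the snapshot cannot satisfy,
# and write them into stack.yaml
$ stack solver --update-config
```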

To be clear — allow-newer: true is an escape hatch for when a package maintainer doesn't update their bounds. And it is not clear whether a given lower bound reflects a known incompatibility or merely versions that were never tested. This is not a stack issue but a combination of an ambiguity in the Cabal format and the social factors of software design.

You could introduce a small section of Template Haskell which runs some code (that should fail on 0.12.0.1) during compilation, to ensure that if somebody does use allow-newer, compilation always fails.
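A minimal sketch of that Template Haskell guard might look like the following; the probe function is hypothetical and would need to be replaced with a real vector computation whose result differs between 0.12.0.1 and 0.12.0.2:

```haskell
{-# LANGUAGE TemplateHaskell #-}

import Control.Monad (unless)

-- Hypothetical probe: stands in for a small computation whose result
-- distinguishes vector-0.12.0.2 from the incompatible 0.12.0.1.
probeVectorBehaviour :: Bool
probeVectorBehaviour = True

-- Declaration splice, evaluated at compile time: if the probe fails,
-- 'fail' aborts compilation, so allow-newer cannot silently select an
-- incompatible vector.
$(do unless probeVectorBehaviour
       (fail "vector is too old: >=0.12.0.2 required for compact regions")
     pure [])

main :: IO ()
main = putStrLn "vector version guard passed"
```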


recursion-ninja commented Jan 9, 2019

I understand that allow-newer is an "escape hatch" for when dependency maintainers do not update their bounds. But it should only be an escape hatch for upper bounds. Lower bounds do exist for real incompatibility reasons; upper bounds rarely do, in my experience. The use case for this escape hatch is almost solely relaxing upper bounds. It should be designed as such, with a different escape hatch added for the much, much rarer (I've never witnessed it) case where lower bounds need to be ignored to use an older version of a package.

I am most certainly not interested in contributing changes. I only want to use a tool which does not repeatedly violate the principle of least astonishment.

@recursion-ninja

Suppose I write a library function like so:

```haskell
-- | Returns the cube of a value
square :: Int -> Int
square x = x * x * x
```

Then suppose a user of my library points out that the function returns the cube of the input and not the square. I would not direct them to the Haddock documentation which clearly specifies the unintuitive functionality. I would not classify correcting the issue as a "design change" instead of a "bug."

square is clearly a misnamed function, just as allow-newer is clearly a misnamed option. I would rename square to cube and add a new square function to fulfill the user's requirements. I would not suggest that the user make these corrections for me. I hope stack can be as reasonable.

Can you create an issue to prioritize fixing this?


recursion-ninja commented Sep 6, 2019

I must insist that this issue is in fact valid and needs to be reopened, since a new issue has not been created to address this defect, especially now that cabal has decided to handle the allow-newer flag sensibly.

According to the cabal-3.0 documentation, allow-newer only allows the bounds checker to relax upper bounds and search for newer package versions. However, stack-2.1.3 does not do this when the allow-newer flag is set to true.

cabal-3.0 also supports the allow-older flag for relaxing lower bounds. stack should maintain feature parity with cabal on these simplest of use cases.
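For comparison, the cabal-3.0 behaviour described above is configured with project-file directives; a minimal cabal.project fragment relaxing bounds on vector alone might look like this sketch:

```
-- cabal.project
allow-newer: vector   -- relax upper bounds involving vector
allow-older: vector   -- relax lower bounds involving vector
```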

@brianhuffman

I came here to open a ticket about the misleadingly-named allow-newer: true option, which caused me quite a bit of confusion earlier today (see the discussion at GaloisInc/parameterized-utils#70).

But I see that @recursion-ninja has already clearly and thoroughly described the exact problem that I have. I can only reiterate the same points: allow-newer is misnamed; either its behavior should be changed to match its name or it should be renamed to allow-newer-or-older; it violates the principle of least surprise; the fact that its odd behavior is documented is no excuse.

I don't understand why this ticket is closed.

@dbaynard: You said on January 9, 2019 that you'd open a new issue, but I can't find it. If you don't want to reopen this issue, then could you please add a link to the new issue from this thread?

@recursion-ninja

@brianhuffman, if I have learned anything over the last 5 years of dealing with stack and its maintainers, it is that any effort towards improving the user experience is a Sisyphean task, no matter how logical or straightforward.

My wholehearted recommendation is to abandon stack and use cabal exclusively.

@brianhuffman

@recursion-ninja, thanks for the advice. We're already using cabal v2-build on a bunch of other projects; maybe now's the time to transition the rest of them. My co-workers and I appreciate your valiant attempt at arguing your case and fighting the good fight here.
