Solver chooses mono-traversable-0.4.0 when attempting to build stack-0.1.3.0 #2759

Open
tebello-thejane opened this Issue Aug 12, 2015 · 24 comments


tebello-thejane commented Aug 12, 2015

Trying to install the latest stack-0.1.3.0 causes the following failure:

Preprocessing library mono-traversable-0.4.0...
[1 of 6] Compiling Data.MonoTraversable ( src/Data/MonoTraversable.hs, dist/build/Data/MonoTraversable.o )

src/Data/MonoTraversable.hs:39:1: Warning:
    The import of ‘$, replicate’ from module ‘Prelude’ is redundant

src/Data/MonoTraversable.hs:72:1: Warning:
    Module ‘Control.Monad.Trans.Error’ is deprecated:
      Use Control.Monad.Trans.Except instead

src/Data/MonoTraversable.hs:131:24: Warning:
    In the use of type constructor or class ‘ErrorT’
    (imported from Control.Monad.Trans.Error):
    Deprecated: "Use Control.Monad.Trans.Except instead"

src/Data/MonoTraversable.hs:192:36: Warning:
    In the use of type constructor or class ‘ErrorT’
    (imported from Control.Monad.Trans.Error):
    Deprecated: "Use Control.Monad.Trans.Except instead"

src/Data/MonoTraversable.hs:324:32: Warning:
    In the use of ‘Unsafe.inlinePerformIO’
    (imported from Data.ByteString.Internal):
    Deprecated: "If you think you know what you are doing, use 'unsafePerformIO'. If you are sure you know what you are doing, use 'unsafeDupablePerformIO'. If you enjoy sharing an address space with a malevolent agent of chaos, try 'accursedUnutterablePerformIO'."

src/Data/MonoTraversable.hs:326:29: Warning:
    In the use of ‘Unsafe.inlinePerformIO’
    (imported from Data.ByteString.Internal):
    Deprecated: "If you think you know what you are doing, use 'unsafePerformIO'. If you are sure you know what you are doing, use 'unsafeDupablePerformIO'. If you enjoy sharing an address space with a malevolent agent of chaos, try 'accursedUnutterablePerformIO'."
makeCorePair: arity missing $cofoldMap_abvQ
[2 of 6] Compiling Data.GrowingAppend ( src/Data/GrowingAppend.hs, dist/build/Data/GrowingAppend.o )
[3 of 6] Compiling Data.Sequences   ( src/Data/Sequences.hs, dist/build/Data/Sequences.o )

src/Data/Sequences.hs:14:1: Warning:
    The import of ‘not’ from module ‘Prelude’ is redundant
[4 of 6] Compiling Data.MinLen      ( src/Data/MinLen.hs, dist/build/Data/MinLen.o )

src/Data/MinLen.hs:80:1:
    Couldn't match representation of type ‘f1 mono’
                             with that of ‘f1 (MinLen nat mono)’
    arising from trying to show that the representations of
      ‘(Element mono -> f1 (Element mono)) -> mono -> f1 mono’ and
      ‘(Element mono -> f1 (Element mono))
       -> MinLen nat mono -> f1 (MinLen nat mono)’ are the same
    Relevant role signatures:
      type role Element nominal
      type role MinLen phantom representational
    NB: We cannot know what roles the parameters to ‘f1’ have;
      we must assume that the role is nominal
    Relevant bindings include
      otraverse :: (Element (MinLen nat mono)
                    -> f (Element (MinLen nat mono)))
                   -> MinLen nat mono -> f (MinLen nat mono)
        (bound at src/Data/MinLen.hs:80:1)
    In the expression:
        ghc-prim-0.4.0.0:GHC.Prim.coerce
          (otraverse ::
             (Element mono -> f (Element mono)) -> mono -> f mono) ::
          forall (f :: * -> *). GHC.Base.Applicative f =>
          (Element (MinLen nat mono) -> f (Element (MinLen nat mono)))
          -> MinLen nat mono -> f (MinLen nat mono)
    In an equation for ‘otraverse’:
        otraverse
          = ghc-prim-0.4.0.0:GHC.Prim.coerce
              (otraverse ::
                 (Element mono -> f (Element mono)) -> mono -> f mono) ::
              forall (f :: * -> *). GHC.Base.Applicative f =>
              (Element (MinLen nat mono) -> f (Element (MinLen nat mono)))
              -> MinLen nat mono -> f (MinLen nat mono)
    When typechecking the code for  ‘otraverse’
      in a derived instance for ‘MonoTraversable (MinLen nat mono)’:
      To see the code I am typechecking, use -ddump-deriv
    In the instance declaration for ‘MonoTraversable (MinLen nat mono)’

src/Data/MinLen.hs:80:1:
    Couldn't match representation of type ‘m1 mono’
                             with that of ‘m1 (MinLen nat mono)’
    arising from trying to show that the representations of
      ‘(Element mono -> m1 (Element mono)) -> mono -> m1 mono’ and
      ‘(Element mono -> m1 (Element mono))
       -> MinLen nat mono -> m1 (MinLen nat mono)’ are the same
    Relevant role signatures:
      type role Element nominal
      type role MinLen phantom representational
    NB: We cannot know what roles the parameters to ‘m1’ have;
      we must assume that the role is nominal
    Relevant bindings include
      omapM :: (Element (MinLen nat mono)
                -> m (Element (MinLen nat mono)))
               -> MinLen nat mono -> m (MinLen nat mono)
        (bound at src/Data/MinLen.hs:80:1)
    In the expression:
        ghc-prim-0.4.0.0:GHC.Prim.coerce
          (omapM :: (Element mono -> m (Element mono)) -> mono -> m mono) ::
          forall (m :: * -> *). GHC.Base.Monad m =>
          (Element (MinLen nat mono) -> m (Element (MinLen nat mono)))
          -> MinLen nat mono -> m (MinLen nat mono)
    In an equation for ‘omapM’:
        omapM
          = ghc-prim-0.4.0.0:GHC.Prim.coerce
              (omapM :: (Element mono -> m (Element mono)) -> mono -> m mono) ::
              forall (m :: * -> *). GHC.Base.Monad m =>
              (Element (MinLen nat mono) -> m (Element (MinLen nat mono)))
              -> MinLen nat mono -> m (MinLen nat mono)
    When typechecking the code for  ‘omapM’
      in a derived instance for ‘MonoTraversable (MinLen nat mono)’:
      To see the code I am typechecking, use -ddump-deriv
    In the instance declaration for ‘MonoTraversable (MinLen nat mono)’
cabal: Error: some packages failed to install:
chunked-data-0.2.0 depends on mono-traversable-0.4.0 which failed to install.
conduit-combinators-1.0.2 depends on mono-traversable-0.4.0 which failed to
install.
mono-traversable-0.4.0 failed during the building phase. The exception was:
ExitFailure 1
stack-0.1.3.0 depends on mono-traversable-0.4.0 which failed to install.

However, specifying a constraint manually via cabal install --constraint 'mono-traversable >= 0.9' stack succeeds.

So, this is not an issue with stack, since it builds just fine when cabal is given a nudge.


Collaborator

lspitzner commented Aug 12, 2015

So, this is not an issue with stack, since it builds just fine when cabal is given a nudge.

Wrong. There is an issue in stack (and its dependencies): mono-traversable-0.4.0 has an upper bound on base of "< 5", yet it does not compile with GHC >= 7.8.

Nonetheless I agree it looks strange that cabal chooses that version of mono-traversable. The behaviour can be observed in the install plan for chunked-data-0.2.0. The diff of the plans without/with a manual constraint on mono-traversable is:

> diff -y --suppress-common-lines <(cabal install --dry-run chunked-data) <(cabal install --dry-run chunked-data --constraint="mono-traversable>0.4.0")
                                         > dlist-0.7.1.1
vector-0.11.0.0                          | dlist-instances-0.1
                                         > vector-0.10.12.3 (latest: 0.11.0.0)
                                         > mwc-random-0.13.3.2
                                         > vector-algorithms-0.7
mono-traversable-0.4.0 (latest: 0.9.2.1) | mono-traversable-0.9.2.1

It appears that cabal has to choose between non-latest vector and non-latest mono-traversable. And indeed:

> cabal install --dry-run chunked-data-0.2.0 --constraint="mono-traversable>0.4.0" --constraint="vector==0.11.0.0"
Resolving dependencies...
cabal: Could not resolve dependencies:
trying: chunked-data-0.2.0 (user goal)
trying: vector-0.11.0.0 (dependency of chunked-data-0.2.0)
trying: mono-traversable-0.9.2.1 (dependency of chunked-data-0.2.0)
next goal: vector-algorithms (dependency of mono-traversable-0.9.2.1)
rejecting: vector-algorithms-0.7, 0.6.0.4, 0.6.0.3, 0.6.0.2, 0.6.0.1
(conflict: vector==0.11.0.0, vector-algorithms => vector>=0.6 && <0.11)
rejecting: vector-algorithms-0.5.4.2, 0.5.4.1, 0.5.4, 0.5.3.1, 0.5.3, 0.5.2,
0.5.1, 0.5.0, 0.4, 0.3.4, 0.3.3, 0.3.2, 0.3.1, 0.3 (conflict: mono-traversable
=> vector-algorithms>=0.6)
Dependency tree exhaustively searched.

It seems to me that cabal works correctly. Otherwise I would expect some reasoning about what is supposedly done wrong.

Also:
In my opinion, the kind of analysis I did above should be done before reporting an "issue". People complain about (and, with what could be seen as arrogance, quickly blame) cabal, while simultaneously filing such bare-bones problem reports. That does not seem fair to a free, open-source project.

Just a rant, please take it as such. It is a response to the community's attitude towards cabal in general, not to this particular report.


Collaborator

lspitzner commented Aug 12, 2015

@snoyberg Maybe the best practical solution would be to x-revision the bound on base to < ghc-7.10 for mono-traversable <= 0.4.0? (Not that I am a big fan of x-revisions.)
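For reference, such an x-revision would amount to tightening the base upper bound in the published .cabal file of mono-traversable-0.4.0. A hypothetical fragment (the original bound was "base < 5"; GHC 7.10 ships base-4.8, so "< 4.8" keeps the package off GHC 7.10):

```cabal
library
  build-depends:
    -- originally "base < 5"; revised to exclude GHC 7.10 (base-4.8)
    base < 4.8
```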

Collaborator

snoyberg commented Aug 12, 2015

Ah, the vector-algorithms restrictive upper bound is indeed the problem, thanks for figuring that out:

$ cabal install --dry-run --package-db=clear --package-db=global chunked-data --constraint 'vector >= 0.11' --constraint 'mono-traversable == 0.9.2.1'
Resolving dependencies...
cabal: Could not resolve dependencies:
trying: chunked-data-0.2.0 (user goal)
trying: vector-0.11.0.0 (dependency of chunked-data-0.2.0)
trying: mono-traversable-0.9.2.1 (dependency of chunked-data-0.2.0)
next goal: vector-algorithms (dependency of mono-traversable-0.9.2.1)
rejecting: vector-algorithms-0.7, 0.6.0.4, 0.6.0.3, 0.6.0.2, 0.6.0.1
(conflict: vector==0.11.0.0, vector-algorithms => vector>=0.6 && <0.11)
rejecting: vector-algorithms-0.5.4.2, 0.5.4.1, 0.5.4, 0.5.3.1, 0.5.3, 0.5.2,
0.5.1, 0.5.0, 0.4, 0.3.4, 0.3.3, 0.3.2, 0.3.1, 0.3 (conflict: mono-traversable
=> vector-algorithms>=0.6)
Dependency tree exhaustively searched.

Collaborator

snoyberg commented Aug 12, 2015

Revisions published. I can't really test this though since the problem is non-deterministic (or at least appears to be, since my cabal install commands initially worked).


tebello-thejane commented Aug 12, 2015

I gave a bug report, with what I did, what I expected to happen, and what happened instead. Expecting more than that from end-users is unreasonable, as it assumes that only people with a certain level of expertise should be allowed to alert the developers when funny shit happens.

I did spend an hour or two trying to track down the culprit. cabal-graphdeps unfortunately runs out of memory on such a large project, and I don't have the experience to figure out what cabal --dry-run -v is trying to say. That's already more than one should expect from end-users with wildly varying levels of expertise.

Not everything that is wrong is due to Cabal. In fact, were it not for Cabal and Hackage then Haskell would probably still be quite successfully "avoiding success". So thanks.


Collaborator

lspitzner commented Aug 12, 2015

When I now do cabal install --dry-run chunked-data, it chooses mono-traversable-0.3.1 unless I have vector-0.10.12.3 installed in the sandbox, even with --package-db=clear --package-db=global. This kind of resolver cleverness is horrible :/


Collaborator

snoyberg commented Aug 12, 2015

How about now? I added some preferred version info and a lower bound in chunked-data.
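The "preferred version info" lives in the preferred-versions file of the Hackage index; an entry steering the solver away from the broken release might look like the following (hypothetical fragment; the range syntax is the same as in build-depends):

```cabal
-- deprecate the release that fails to build, preferring anything else
mono-traversable > 0.4.0 || < 0.4.0
```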


Collaborator

lspitzner commented Aug 12, 2015

@tebello-thejane I agree with your criticism of what I seem to expect of end-users; that is asking too much. But then you should admit your limited knowledge and ask where the problem might lie, rather than make the bold claim "So, this is not an issue with stack [..]". There is a big difference in tone between "there is an error in this tool" and "I wish this command succeeded; please help".

I also suggest you mention how much time you invested in trying to diagnose the issue.

I am aware that it was @snoyberg who first claimed a fault in cabal :)


Collaborator

lspitzner commented Aug 12, 2015

@snoyberg seems all fine now :)
(stack seemed to resolve to the right plan even before the latest fix; I should have mentioned that. My point was about the sandbox being a potential cause of the non-determinism.)


Member

23Skidoo commented Aug 12, 2015

Looks like it wasn't a solver bug after all, so closing.

@lspitzner Thanks for diagnosing the cause of the error.

23Skidoo closed this Aug 12, 2015


Collaborator

snoyberg commented Aug 12, 2015

I'm not going to go much deeper on this... but equating "technically speaking this is doing something that can be justified even though no one would ever want it to behave this way" with "not a bug" has been a large source of frustration with cabal. The fact that the dependency solver chose an ancient version of mono-traversable instead of a recent version of vector, when release dates would provide clear indication that the other way around would be more likely to succeed, is the kind of technically-correct-yet-practically-wrong thing that causes users grief constantly.


Member

23Skidoo commented Aug 12, 2015

@snoyberg You have a point. Not sure, though, if release dates info is currently available to cabal-install.
@kosmikus, do you think something can be done in the solver to prevent such errors in the future?


Member

ttuegel commented Aug 12, 2015

@23Skidoo I've investigated that before, and yes, release dates are available in the form of timestamps on the package descriptions in the index tar.


Collaborator

lspitzner commented Aug 12, 2015

@snoyberg

equating [technically correct] with [not a bug] has been a large source of frustration with cabal

cabal is software; it does not equate anything. Who is the intended recipient of this message? Me? The cabal maintainers? If the former, I quote from my initial reply:

It seems to me that cabal works correctly. Otherwise I would expect some reasoning about what is supposedly done wrong.

So I never said that this is definitely not a bug, or that the implementation could not be improved to handle such a situation better. I left room for more concrete input. The criticism seems inappropriate.

And regardless of interpretation, such vague criticism is not constructive.


Collaborator

snoyberg commented Aug 13, 2015

@lspitzner Please look at the context of my comment; it was clearly not directed at you, nor was it vague criticism, but an actual suggestion for a change in approach to which issues are treated by the cabal team as bugs and which aren't. In fact, the rest of my comment was very concrete about how I think cabal could do better here. I'm actually not sure how someone who read my entire comment could interpret it as vague criticism.


Member

hvr commented Aug 14, 2015

@ttuegel IIRC the release dates are not reliably available in the 00-index tarball, as they get shadowed by cabal edits.

PS: I think I've stumbled over a couple of other pitfalls you'd run into by relying on timestamps for the solver, but I need to think it through a bit more. I've been toying with the idea of using timestamps for guiding supervised .cabal edits, as that's the one place where I can see that information being useful as a first-order approximation of the build-depends truth.


Member

hvr commented Aug 14, 2015

@tebello-thejane For future reference, this sounds a lot like the kind of issue the Hackage Trustee issue tracker over at https://github.com/haskell-infra/hackage-trustees/issues is for, as this isn't cabal's fault (even though I see the point that the cabal UI could be improved to make the problem clearer) but rather the result of package authors omitting proper version bounds.


Contributor

kosmikus commented Aug 14, 2015

@23Skidoo You could do me and others who occasionally work on the solver a huge favour if you'd turn the enhancement request to use package timestamps in the solver into a new issue. It's difficult to figure out the actual reason this is still open from a bug report with a lot of discussion, where the important info is buried deep inside a few comments. The new issue can of course reference this one.


tebello-thejane commented Aug 14, 2015

I was not aware of the Trustee issue tracker, @hvr. Thanks for the link.


Member

ttuegel commented Aug 15, 2015

@ttuegel IIRC the release dates are not reliably available in the 00-index tarball, as they get shadowed by cabal edits

@hvr You're quite right! I momentarily forgot that tar only saves the last modification time, not the creation time. I think there has been talk of treating the package index as append-only; at least this would let us back out the creation time, albeit at the cost of traversing the whole archive. The truth is, tar is being pushed pretty far as a database format here.

PS: I think I've stumbled over a couple of other pitfalls you'd run into by relying on timestamps for the solver, but I need to think it through a bit more... I've been toying with the idea of using timestamps for guiding supervised .cabal edits, as that's the one place where I can see that information being useful as a first-order approximation of the build-depends-truth

I don't think the idea here is to rely on the timestamps in general, but only to use them to prioritize conflicting constraints; i.e. prefer constraints that rule out older packages to constraints that rule out newer ones.
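The preference described here can be sketched in a few lines of Haskell. This is a toy model, not cabal's actual solver; the package names and integer timestamps are made up for illustration:

```haskell
import Data.List (sortBy)
import Data.Ord (comparing)

-- A candidate package version together with its (made-up) upload time.
type Candidate = (String, Int)

-- A constraint, identified by a label, plus the versions it rules out.
data Constraint = Constraint
  { label    :: String
  , excludes :: [Candidate]
  }

-- The newest upload a constraint excludes. A constraint that only rules
-- out old versions is cheap to keep, so lower values are preferred.
newestExcluded :: Constraint -> Int
newestExcluded = maximum . map snd . excludes

-- Order conflicting constraints: those ruling out only older packages
-- come first, i.e. we prefer them over constraints ruling out newer ones.
prioritize :: [Constraint] -> [Constraint]
prioritize = sortBy (comparing newestExcluded)

main :: IO ()
main = do
  let keepOldMono = Constraint "pin mono-traversable-0.4.0"
                      [("mono-traversable-0.9.2.1", 900)]
      keepNewVector = Constraint "pin vector-0.11.0.0"
                        [("vector-0.10.12.3", 400)]
  -- the vector constraint only excludes an older release, so it wins
  mapM_ (putStrLn . label) (prioritize [keepOldMono, keepNewVector])
```

In this toy scenario the solver would keep vector-0.11.0.0 and give up the pin on the ancient mono-traversable, matching the outcome users would expect here.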


Member

23Skidoo commented Aug 15, 2015

I think there has been talk of using the package index as append-only; at least this would let us back out the creation time, albeit at the cost of traversing the whole archive.

It needs to be traversed anyway when creating the index cache; the upload date could be saved in the cache as well.
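If the index were append-only, recovering upload dates during that same cache-building traversal could look roughly like this. A sketch only, not cabal-install's actual cache code; it assumes entries arrive in archive order as (package-id, timestamp) pairs:

```haskell
import qualified Data.Map.Strict as Map

-- e.g. "mono-traversable-0.4.0"
type PackageId = String

-- In an append-only index, the first entry for a package-version is the
-- original upload; later entries for the same id are metadata revisions.
-- insertWith keeps the already-present (earlier) timestamp, so the map
-- ends up holding the original upload date for every package-version.
uploadDates :: [(PackageId, Int)] -> Map.Map PackageId Int
uploadDates = foldl step Map.empty
  where
    step m (pkg, t) = Map.insertWith (\_new old -> old) pkg t m
```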


Member

ttuegel commented Aug 15, 2015

It needs to be traversed anyway when creating the index cache; the upload date could be saved in the cache as well.

Yes, but if the archive is append-only, it becomes much larger, right? By a factor of the average number of package revisions.


Member

hvr commented Aug 15, 2015

Right, with a persistent package index this should work again, assuming the history won't be truncated.

Btw, can we please create a new focused ticket like @kosmikus suggested, and more importantly specify exactly how the timestamps are to affect the solving process? That'd make it easier to reason about the semantics and test it on paper by trying to come up with counter-examples.


Member

hvr commented Aug 15, 2015

@ttuegel some numbers:

We currently have 8565 packages, 58397 package-versions, 4574 of those package-versions have a non-zero x-rev, and in total there were 5543 cabal edits so far.

Right now, if we created an uncompressed index tarball with full history, it would not be "much larger" but just roughly 10% larger. Compressing with an algorithm with a large enough dictionary and/or window size should bring that relative 10% increase down a bit.
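The ~10% figure follows from the counts above: a full-history index carries one extra entry per cabal edit on top of the existing package-version entries, i.e. 5543 extra entries against 58397. A one-line check using the numbers from this comment:

```haskell
main :: IO ()
main = do
  let packageVersions = 58397 :: Double  -- entries already in the index
      cabalEdits      = 5543  :: Double  -- extra entries with full history
  -- the ratio is roughly 0.095, i.e. the index grows by about 10%
  print (cabalEdits / packageVersions)
```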
