
Update stackage resolver to LTS 21.6 #2275

Merged
merged 9 commits into main, Aug 11, 2023

Conversation

@paulcadman (Collaborator) commented Aug 7, 2023

Stackage LTS 21.6 uses GHC 9.4.5; binaries for HLS are available via ghcup.

Changes required:

  1. Fix warnings about type-level `:` and `[]` used without ticks.

  2. Fix warnings about the deprecation of the built-in `~`, replaced with `import Data.Type.Equality ( type (~) )` in the Prelude.

  3. `SemVer` is no longer a `Monoid`.

  4. `path-io` now contains the `AnyPath` instances we were defining (thanks to Jan), so they can be removed.

  5. Added aeson-better-errors-0.9.1.1 as an extra-dep. It is not part of the resolver only because it has a strict upper bound on base that is incompatible with GHC 9.4.5. To work around this I've set:

    allow-newer: true
    allow-newer-deps:
      - aeson-better-errors
    

    which relaxes the upper constraint bounds for aeson-better-errors only. When the base constraints have been updated, we can remove this workaround.

  6. Use stack2cabal to generate the cabal.project file and to freeze dependency versions.

    https://www.stackage.org/lts-21.6/cabal.config now contains the constraint `haskeline installed`, which means that the version of haskeline that is globally installed with GHC 9.4.5 will be used.

    Constraints from cabal imports cannot yet be overridden, so it's not possible to get rid of this conflict using the import method. We therefore use stack2cabal with an explicit freeze file instead.

  7. Remove runTempFilePure as it is unused and depends on Polysemy.Fresh from polysemy-zoo, which is not available in the resolver. It turns out that the Fresh effect cannot be used in a pure context anyway, so runTempFilePure could never serve its original purpose.

  8. We now use https://github.com/benz0li/ghc-musl as the base container for static linux builds, which means we don't need to maintain our own Docker container for this purpose.

  9. The PR for the nightly builds is ready (Use ghc-musl instead of our own Docker repo, juvix-nightly-builds#2); it should be merged as soon as this PR is merged.
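Items 1 and 2 amount to source changes along these lines. This is an illustrative sketch, not code from the Juvix codebase, and it assumes GHC 9.4+ (base >= 4.17), where Data.Type.Equality exports `type (~)`:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE TypeOperators #-}

module Main where

-- Item 2: GHC 9.4 deprecates the built-in ~; import it from
-- Data.Type.Equality instead (available from base 4.17):
import Data.Type.Equality (type (~))
import Data.Proxy (Proxy (..))

-- Item 1: type-level [] and : now want an explicit promotion tick:
type Flags = '[Bool, Int]      -- instead of the unticked [Bool, Int]
type Flags' = Bool ': '[Int]   -- instead of Bool : [Int]

sameRep :: (a ~ b) => Proxy a -> Proxy b -> Bool
sameRep _ _ = True

main :: IO ()
main = print (sameRep (Proxy :: Proxy Int) (Proxy :: Proxy Int))
```

On GHC 9.4.5 this compiles without the tick or `~`-deprecation warnings; on earlier GHCs the unticked spellings still compile but trigger the warnings this PR fixes.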

Thanks to @benz0li for maintaining https://github.com/benz0li/ghc-musl and (along with @TravisCardwell) for help with building the static binary.

@paulcadman paulcadman added the ghc label Aug 7, 2023
@paulcadman paulcadman added this to the 0.4.3 milestone Aug 7, 2023
@paulcadman paulcadman self-assigned this Aug 7, 2023
@paulcadman paulcadman force-pushed the stack-lts-21.6 branch 2 times, most recently from b9bc2d2 to 21a768e Compare August 7, 2023 10:37
@jonaprieto (Collaborator) commented Aug 7, 2023

To bump the LTS version, we are using the checklist below as a guide:

  • Update Stack resolver in stack.yaml
  • Modify tested-with section in package.yaml
  • Update Linux Github Action workflow in .github/workflows/linux-static-binary.yaml and adjust docker/Dockerfile-ghc-alpine-9.2.7 (now 9.4.5)
  • Revise GHC/Stack/Cabal versions in .devcontainer/Dockerfile
  • Refresh Cabal configuration in cabal-project
  • Update the nightly release workflow, example change
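The first two checklist items correspond to edits along these lines (a sketch; the real files contain more fields):

```yaml
# stack.yaml
resolver: lts-21.6        # ships GHC 9.4.5

# package.yaml
tested-with: GHC == 9.4.5
```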

@paulcadman paulcadman marked this pull request as draft August 7, 2023 12:48
@paulcadman (Collaborator, Author) commented:

I attempted to use the glcr.b-data.ch/ghc/ghc-musl:9.4.5 Docker images for the static linux build but I got linker errors, so we should probably continue to use our own Docker GHC images.

```
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: /usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/crtbeginT.o: relocation R_X86_64_32 against hidden symbol `__TMC_END__' can not be used when making a shared object
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: failed to set dynamic section sizes: bad value
collect2: error: ld returned 1 exit status

<no location info>: error:
    `gcc' failed in phase `Linker'. (Exit code: 1)
```

@paulcadman (Collaborator, Author) commented:

I also see the linker errors when we build Juvix with a build of GHC 9.4.5 using our Dockerfile.

```
ld.lld: error: can't create dynamic relocation R_X86_64_32 against symbol: __TMC_END__ in readonly segment; recompile object files with -fPIC or pass '-Wl,-z,notext' to allow text relocations in the output
>>> defined in /usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/crtend.o
>>> referenced by crtstuff.c
>>>               /usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/crtbeginT.o:(.text+0x1)
```

In addition, there's the following warning, which looks like a GHC compiler bug.

```
src/Juvix/Compiler/Internal/Translation/FromInternal/Analysis/Termination/Data/FunctionCall.hs: warning: [-Wmissed-specialisations]
    Could not specialise imported function ‘ghc-prim:GHC.Classes.$fEq[]_$c==’
      when specialising ‘ghc-prim:GHC.Classes.$fEq[]_$c/=’
    Probable fix: add INLINABLE pragma on ‘ghc-prim:GHC.Classes.$fEq[]_$c==’
```

@paulcadman (Collaborator, Author) commented:

I raised the linker issue with ghc-musl: benz0li/ghc-musl#1

@benz0li (Contributor) commented Aug 9, 2023

> I attempted to use the glcr.b-data.ch/ghc/ghc-musl:9.4.5 Docker images for the static linux build but I got linker errors, so we should probably continue to use our own Docker GHC images.
>
> ```
> /usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: /usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/crtbeginT.o: relocation R_X86_64_32 against hidden symbol `__TMC_END__' can not be used when making a shared object
> /usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: failed to set dynamic section sizes: bad value
> collect2: error: ld returned 1 exit status
>
> <no location info>: error:
>     `gcc' failed in phase `Linker'. (Exit code: 1)
> ```

Did this also happen with glcr.b-data.ch/ghc/ghc-musl:9.2.x?

@paulcadman (Collaborator, Author) commented:

> Did this also happen with glcr.b-data.ch/ghc/ghc-musl:9.2.x?

When I try ghc-musl:9.2.7 with the current main branch of Juvix, I get a different error:

```
/juvix # stack install --system-ghc --no-install-ghc
musl libc (x86_64)
Version 1.2.3
Dynamic Program Loader
Usage: /lib/ld-musl-x86_64.so.1 [options] [--] pathname
Error: [S-6362]
No compiler found, expected minor version match with ghc-9.2.7 (x86_64) (based on resolver setting in /juvix/stack.yaml).
To install the correct GHC into /root/.stack/programs/x86_64-linux/, try running 'stack setup' or use the '--install-ghc' flag. To use your system GHC installation, run 'stack config set system-ghc --global true', or use the '--system-ghc' flag.
/juvix # ghc --version
The Glorious Glasgow Haskell Compilation System, version 9.2.7
```

For some reason stack does not detect that the system GHC matches the one required by the resolver. The versions seem to match, so I'm not sure what's wrong.

@benz0li (Contributor) commented Aug 9, 2023

3. You would have to use Cabal (the tool) instead of Stack

#2166 (comment)

because of

> Where exactly are the bindists? Stack needs URLs to GHC bindists. Docker images are not enough.

commercialhaskell/stack#6141 (comment)

and thus no official support.


Does it work when you use cabal?

@benz0li (Contributor) commented Aug 9, 2023

P.S.: There might be official support for AArch64 at some point: https://gitlab.haskell.org/ghc/ghc/-/issues/23482

@paulcadman (Collaborator, Author) commented:

> 1. You would have to use Cabal (the tool) instead of Stack
>
> #2166 (comment)
>
> because of
>
> > Where exactly are the bindists? Stack needs URLs to GHC bindists. Docker images are not enough.
>
> commercialhaskell/stack#6141 (comment)
>
> and thus no official support.

I thought this only applied to aarch64; we are building on x86_64?

> Does it work when you use cabal?

I can try it, but we cannot use cabal for our release because we use https://hackage.haskell.org/package/gitrev to inject git revision info into the binary. This does not work with cabal v2-style builds.
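For context, gitrev injects the revision via Template Haskell splices that shell out to `git` at compile time, which is why the build environment matters. Below is a minimal sketch of the same splice mechanism using only base and template-haskell; the gitrev-specific usage is shown in a comment, since it needs the gitrev package and a git checkout, and "deadbeef" is a placeholder hash:

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Main where

import Language.Haskell.TH (stringE)

-- With gitrev one would write (requires the gitrev package):
--   import Development.GitRev (gitHash)
--   revision = $(gitHash)
-- Here we embed a compile-time string with the same splice mechanism:
revision :: String
revision = $(stringE "deadbeef")

main :: IO ()
main = putStrLn ("commit " ++ revision)
```

Because the splice runs at compile time, whatever the build tool does between checkout and compilation (e.g. cabal v2 building from a sdist copy outside the git tree) determines whether `git` can be queried at all.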

@benz0li (Contributor) commented Aug 9, 2023

> I thought this only applied to aarch64, we are building on x86_64?

👍 I stand corrected.

@paulcadman (Collaborator, Author) commented:

Using cabal with the --enable-executable-static flag works without linker errors using glcr.b-data.ch/ghc/ghc-musl:9.4.5. So we just need a way to inject the git info into the binary, or remove this feature to unblock this.

@benz0li (Contributor) commented Aug 9, 2023

> Using cabal with the --enable-executable-static flag works without linker errors using glcr.b-data.ch/ghc/ghc-musl:9.4.5. So we just need a way to inject the git info into the binary, or remove this feature to unblock this.

It also works with stack. I will post the solution asap over at benz0li/ghc-musl#1.

https://www.stackage.org/lts-21.6/cabal.config now contains the constraint `haskeline installed`. GHC 9.4.5 comes with haskeline 0.8.2 preinstalled, but our configuration contains the source-repository-package for haskeline 0.8.2.1, so if you try to run `cabal build` you get the following conflict:

```
Resolving dependencies...
Error: cabal: Could not resolve dependencies:
[__0] next goal: haskeline (user goal)
[__0] rejecting: haskeline-0.8.2.1 (constraint from project config
https://www.stackage.org/lts-21.6/cabal.config requires installed instance)
[__0] rejecting: haskeline-0.8.2/installed-0.8.2, haskeline-0.8.2,
haskeline-0.8.1.3, haskeline-0.8.1.2, haskeline-0.8.1.1, haskeline-0.8.1.0,
haskeline-0.8.0.1, haskeline-0.8.0.0, haskeline-0.7.5.0, haskeline-0.7.4.3,
haskeline-0.7.4.2, haskeline-0.7.4.1, haskeline-0.7.4.0, haskeline-0.7.3.1,
haskeline-0.7.3.0, haskeline-0.7.2.3, haskeline-0.7.2.2, haskeline-0.7.2.1,
haskeline-0.7.2.0, haskeline-0.7.1.3, haskeline-0.7.1.2, haskeline-0.7.1.1,
haskeline-0.7.1.0, haskeline-0.7.0.3, haskeline-0.7.0.2, haskeline-0.7.0.1,
haskeline-0.7.0.0, haskeline-0.6.4.7, haskeline-0.6.4.6, haskeline-0.6.4.5,
haskeline-0.6.4.4, haskeline-0.6.4.3, haskeline-0.6.4.2, haskeline-0.6.4.1,
haskeline-0.6.4.0, haskeline-0.6.3.2, haskeline-0.6.3.1, haskeline-0.6.3,
haskeline-0.6.2.4, haskeline-0.6.2.3, haskeline-0.6.2.2, haskeline-0.6.2.1,
haskeline-0.6.2, haskeline-0.6.1.6, haskeline-0.6.1.5, haskeline-0.6.1.3,
haskeline-0.6.1.2, haskeline-0.6.1.1, haskeline-0.6.1, haskeline-0.6.0.1,
haskeline-0.6, haskeline-0.5.0.1, haskeline-0.5, haskeline-0.4,
haskeline-0.3.2, haskeline-0.3.1, haskeline-0.3, haskeline-0.2.1,
haskeline-0.2 (constraint from user target requires ==0.8.2.1)
[__0] fail (backjumping, conflict set: haskeline)
After searching the rest of the dependency tree exhaustively, these were the
goals I've had most trouble fulfilling: haskeline
```

Constraints from cabal imports cannot yet be overridden, so it's not possible to get rid of this conflict.

To work around this we instead use stack2cabal to generate a cabal.project.freeze file for dependencies. This uses https://www.stackage.org/lts-21.6/cabal.config at generation time to fix the dependency versions.

The build additionally depends on alex, happy and python3.

The static flag sets `ld-options` required to produce a statically linked juvix binary. We should be able to set --ghc-options='-optl-static' to get the same effect, but this causes linker errors.
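The static-flag wiring described above might look like the following hypothetical cabal fragment (flag and executable names assumed for illustration; the actual juvix package configuration may differ):

```cabal
flag static
  description: Build a statically linked binary
  default: False
  manual: True

executable juvix
  main-is: Main.hs
  if flag(static)
    ld-options: -static -pthread
```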
@jonaprieto (Collaborator) commented:

Amazing job @paulcadman.
Special thanks to @benz0li!

@jonaprieto jonaprieto merged commit 46ab163 into main Aug 11, 2023
4 checks passed
@jonaprieto jonaprieto deleted the stack-lts-21.6 branch August 11, 2023 09:49
@benz0li (Contributor) commented Aug 13, 2023

> > Did this also happen with glcr.b-data.ch/ghc/ghc-musl:9.2.x?
>
> When I try with ghc-musl:9.2.7 with the current main branch of juvix, I get a different error:
>
> ```
> /juvix # stack install --system-ghc --no-install-ghc
> musl libc (x86_64)
> Version 1.2.3
> Dynamic Program Loader
> Usage: /lib/ld-musl-x86_64.so.1 [options] [--] pathname
> Error: [S-6362]
> No compiler found, expected minor version match with ghc-9.2.7 (x86_64) (based on resolver setting in /juvix/stack.yaml).
> To install the correct GHC into /root/.stack/programs/x86_64-linux/, try running 'stack setup' or use the '--install-ghc' flag. To use your system GHC installation, run 'stack config set system-ghc --global true', or use the '--system-ghc' flag.
> /juvix # ghc --version
> The Glorious Glasgow Haskell Compilation System, version 9.2.7
> ```
>
> For some reason stack does not detect that the system ghc matches the one required by the resolver. The versions seem to match, so I'm not sure what's wrong.

Resolved for glcr.b-data.ch/ghc/ghc-musl:9.2.8. See issue benz0li/ghc-musl#3 for more information.

(glcr.b-data.ch/ghc/ghc-musl:9.4.6 and glcr.b-data.ch/ghc/ghc-musl:9.6.2 are built using the Hadrian build system.)

Successfully merging this pull request may close these issues.

Use GHC docker images from benz0li / ghc-musl to build linux binaries