
Initial roadmap. #19

Merged 3 commits into master from roadmap on Dec 16, 2017

Conversation

@mboes (Member) commented Dec 15, 2017

Modeled on the Bazel roadmap.

I omitted for now items that are still speculative and therefore
haven't been scheduled yet. Two in particular come to mind:

  • Auto-generation of BUILD files (Generate BUILD files from Cabal descriptions, #17). Makes sense, but we need to discuss whether a Gazelle-like approach makes sense for Haskell, and whether we want something more fine-grained or more coarse-grained (see the sketch after this list).
  • Hackage dependencies: at the moment we delegate building what Stackage would call "snapshots" to other tools (namely Nix). It's unclear whether Bazel build rules want to be getting into that business, and for what gain.
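
For concreteness, here is a rough sketch of what a Gazelle-like generator might emit for a hypothetical Cabal package. The load path, the haskell_library rule and its attributes, and the package and workspace names are placeholders, not a settled design:

# Hypothetical output of a Cabal-to-BUILD generator for a package "acme"
# with two library modules. All rule, attribute and workspace names are
# illustrative; the actual rules_haskell API may differ.
load("@io_tweag_rules_haskell//haskell:haskell.bzl", "haskell_library")

haskell_library(
    name = "acme",
    srcs = [
        "src/Acme/Foo.hs",
        "src/Acme/Bar.hs",
    ],
    # Lifted from the build-depends field of acme.cabal.
    deps = [
        "@hackage_text//:text",
        "@hackage_bytestring//:bytestring",
    ],
    visibility = ["//visibility:public"],
)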

@mboes (Member, Author) commented Dec 15, 2017

@Fuuzetsu could you fill in the "build and test inline-java" milestone based on the content of #14? And also start filling in what's needed for sparkle? Will merge after that.

@Fuuzetsu (Collaborator) commented

I didn't touch sparkle yet; hopefully I'll get to it by the end of today and can fill that in.

I'm not sure I agree on those priorities.

  • P0. Support passing compiler flags on command-line.

I don't actually think this is critical; in practice, how often do you pass compiler flags on the command line that change constantly? It's important, but I'm not sure it's a blocker.

  • P1. GHC server to amortize cost of invoking ghc command.

I would put this as P2, or not even in 1.0. It's an optimisation that I'm not yet convinced we'll want, and even if we become convinced we do, it should be a separate project that acts as a drop-in GHC replacement.

  • P1. Make object cache available to server to achieve incremental rebuild of haskell_library targets.

See above

  • P1. Import toolchains from rules_nixpkgs.

Do we want to force Nix on rules_haskell users?

  • P2. Define official GHC bindists as toolchains for each Tier-1 platform.

I think we should have either the nixpkgs way or the official-bindist way, leaning towards the latter.

  • P2. Define cross-compiler toolchains.

For 1.0? I don't think we should mess with that until the native build story is completely solid…

  • P2. Support multiple build flavours: fastbuild, opt, dbg/profiling.

@mboes (Member, Author) commented Dec 15, 2017

  • We can't ship a 1.0 without incremental rebuilds. We need to achieve that one way or another. If the one way is to make the Bazel dependency graph super fine-grained, then the one-shot cost of invoking GHC is going to be bad; we've seen that already. If it's a coarse-grained dependency graph, then some build state needs to be maintained somehow. Both approaches call for a server. Whether this is a separate reusable project or not doesn't matter for the purposes of a roadmap: the need is still driven by rules_haskell. (See the sketch after this list for the fine-grained option.)
  • There is no reason to force Nix on rules_haskell users, but it is a robust solution to the hermeticity problem. Other sources for toolchains can be supported too, depending on user needs; in fact that's already part of the roadmap.
  • I think cross-compiling is one of the key value-adds of a Bazel-based build. It's not slated as a must-have currently, but it's something e.g. @jml has expressed particular interest in. Note that we'll need to work with GHC HQ here to get cross-compiler bindists produced and kept up to date.
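
To make the trade-off concrete, here is a rough sketch of the fine-grained option, with one target per module so that Bazel itself provides the incrementality. The load path, the haskell_library rule and its attributes are placeholders:

# Fine-grained layout (all names are placeholders): one target per module,
# so Bazel rebuilds only what changed, but every target pays the one-shot
# cost of starting GHC.
load("@io_tweag_rules_haskell//haskell:haskell.bzl", "haskell_library")

haskell_library(
    name = "B",
    srcs = ["B.hs"],
)

haskell_library(
    name = "A",
    srcs = ["A.hs"],
    deps = [":B"],
)

haskell_library(
    name = "C",
    srcs = ["C.hs"],
    deps = [":B"],
)

The coarse-grained alternative keeps everything in one target; that is where the server and the shared object cache come in.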

I think we should have either the nixpkgs way or the official-bindist way, leaning towards the latter.

Why do we have to choose? The latter isn't a possibility to begin with on all platforms (e.g. NixOS).

I'll reduce priority on command-line args.

@Fuuzetsu (Collaborator) commented

After discussion off GitHub, I agree with most things here now. I would like to change some wording on the incremental build part (we want incremental builds, not a specific solution, right now). I got delayed and will only be able to add my parts tomorrow.

@jml commented Dec 15, 2017

Auto-generation of BUILD files.

I reckon we should include this in the roadmap, even though we don't have all the details decided. We know we want to do it, after all.

@mboes (Member, Author) commented Dec 15, 2017

Yes, better to talk about the objective only: incremental builds.

@johnynek commented

I'm following along and very excited about this project.

I work on the Scala rules for Bazel (which we use in production at Stripe). I'd be interested in using these rules.

For incremental builds, I hope these rules just lean on Bazel and don't keep any build state in workers. Bazel, as you probably know, allows you to have worker processes, but the intent is that the build is still a pure function from inputs to outputs. I'm a bit nervous about the safety of using Bazel caching (which is a huge feature) if too much state is kept in the worker.

I hope we can just use small targets and let Bazel's default approach to incrementalism work.

Thanks for your work on this!

@mboes (Member, Author) commented Dec 15, 2017

@jml I myself don't know yet, but maybe you guys do? For example, Java projects don't state source-level dependencies in the Bazel graph, because the Java compiler figures them out. The same is true of GHC (ghc A.hs B.hs C.hs compiles B first if A and C depend on it). We do lose incrementality, but that's what the other topics are about: recovering incrementalism.
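
Concretely, the coarse-grained layout might look like the sketch below (the load path, rule and attribute names are placeholders): GHC, not Bazel, works out the module ordering, so Bazel only sees one action per target.

# Coarse-grained layout (all names are placeholders): one target, one GHC
# invocation. GHC compiles B.hs first because A.hs and C.hs import it;
# Bazel never sees the module-level dependencies.
load("@io_tweag_rules_haskell//haskell:haskell.bzl", "haskell_library")

haskell_library(
    name = "lib",
    srcs = [
        "A.hs",
        "B.hs",
        "C.hs",
    ],
)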

Or did you mean generating BUILD files to tell Bazel that, given a package set (e.g. a Stackage snapshot), it should build package B before package A? See the point in the PR description about "Hackage dependencies". I'm just not sure about wanting to do that!

@mboes (Member, Author) commented Dec 15, 2017

@johnynek really great to have you stop by! Rule granularity is perhaps the hardest issue we've had to grapple with so far; we've been going back and forth since we started on what the best choice would be for Haskell, given the very slow compile times. So any experience report here would be great.

In rules_scala and at Stripe, what is the unit of compilation typically exposed to Bazel? Is it a single source file? A group of source files? Or a whole "package" (i.e. a library)? If you are specifying dependencies between source files in BUILD files, doesn't that get tiring? Do you therefore automate BUILD file generation?

I'd be interested in using these rules.

Just curious: you mean to build Haskell code as well, and not just the Scala/Java code you guys have at Stripe?

@johnynek commented

We usually have 1-10 build rules per directory. Some put everything in the directory in one rule, some break it up somewhat. We have a macro that allows the user to manually declare the source dependencies and then presents a single target to external consumers:

scala_module(
  name = "foo",
  src_deps = {
    "Foo.scala": [],
    "Bar.scala": ["Foo.scala", "Baz.scala"],
    "Baz.scala": [],
  },
  deps = [ ... ],
)

That makes a single target per file internally, but all of those have private visibility; the name then exports all the internal targets. This has worked pretty well so far: not too verbose, but also great incrementalism and an easy way to depend on the whole thing externally.
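
Roughly, the expansion looks like the following sketch (the generated target names, the load path, and the use of exports are illustrative, not our exact implementation):

# Per-file targets, private to the package (generated names are illustrative).
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library")

scala_library(
    name = "foo_Foo",
    srcs = ["Foo.scala"],
    visibility = ["//visibility:private"],
)

scala_library(
    name = "foo_Baz",
    srcs = ["Baz.scala"],
    visibility = ["//visibility:private"],
)

scala_library(
    name = "foo_Bar",
    srcs = ["Bar.scala"],
    deps = [":foo_Foo", ":foo_Baz"],
    visibility = ["//visibility:private"],
)

# The public "foo" target re-exports the per-file targets so other packages
# can simply depend on ":foo".
scala_library(
    name = "foo",
    exports = [":foo_Foo", ":foo_Bar", ":foo_Baz"],
)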

I talked to the Bazel folks at BazelCon about Bazel having a limited form of dynamic targets that would allow a tool to tell Bazel about these internal private targets. There was some interest in addressing slower compilers (like Scala, C++, Haskell, ...).

As for using these rules: @non and I wrote an internal tool in Haskell. It would be nice to be able to build it with Bazel, like the rest of our builds, rather than with Stack.

@Fuuzetsu merged commit 9de027f into master on Dec 16, 2017
@Fuuzetsu deleted the roadmap branch on December 16, 2017 at 18:02
Profpatsch pushed commits referencing this pull request on Mar 7, 2019:

  • Link cbits statically on Windows
  • Fix bogus #19 and check for '.' in include dirs