Get rid of ghc-cabal and package-data.mk #18
Comments
Another example of accessing the information stored in […]
My understanding is that we would basically absorb […]
@angerman Yes, that's the intention.
@ndmitchell @angerman I'd appreciate your help with […]
I'm not at a machine with MSYS/Mingw in the next 24 hours, so I'm not much use, I'm afraid.
I'll give this a try.
Alright, I've tried to give this a go, but I think this is not possible; I believe we run into this: https://ghc.haskell.org/trac/ghc/ticket/10514. We simply cannot derive […] using […]
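For context, a minimal sketch of the sort of deriving limitation being hit here (the `Expr` type is illustrative, not from Hadrian): GHC rejects `deriving` for many classes on GADTs with non-uniform result types, but `Show` can still be obtained via `StandaloneDeriving`, which later comments in this thread use as a bridge for the other instances.

```haskell
{-# LANGUAGE GADTs, StandaloneDeriving #-}

-- A toy GADT with non-uniform result types (hypothetical example).
data Expr a where
    IntE  :: Int  -> Expr Int
    BoolE :: Bool -> Expr Bool

-- GHC cannot derive e.g. Read for such a GADT, since the result type
-- varies per constructor, but Show works via standalone deriving:
deriving instance Show (Expr a)

main :: IO ()
main = putStrLn (show (IntE 42))  -- prints "IntE 42"
```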
While I do not see […]
Can't you just write your own `Hashable` by hand? It shouldn't be too difficult.
Did I ever mention I have no idea what I'm doing? ;-) We also need […]
You can do `deriving Show` on a GADT, and then you can just make `Hashable` by doing […]
This is what we have in […]:

```haskell
-- Instances for storing in the Shake database
instance Binary Way where
    put = put . show
    get = fmap read get

instance Hashable Way where
    hashWithSalt salt = hashWithSalt salt . show

instance NFData Way where
    rnf (Way s) = s `seq` ()
```
@ndmitchell thanks! @snowleopard if we had […]
Hmm. We could of course just go ahead and implement `read` for each and every single one on our own… […]
@snowleopard, if you're going to do that, you might as well do: […]
You can (kind of):

```haskell
data Hidden :: (* -> *) -> * where
    Hide :: gadt index -> Hidden gadt
```

So if you have:

```haskell
data MyGadt :: * -> * where
    C0 :: MyGadt ()
    C1 :: a -> MyGadt a
```

You can write a […]
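A hedged sketch of how the existential wrapper above can be used (the `describe` function and the `Int` payload are my own illustration, not from the thread): `Hide` erases the type index, so values built from different `MyGadt` constructors can live in one list, and pattern matching recovers them.

```haskell
{-# LANGUAGE GADTs, KindSignatures #-}

data MyGadt :: * -> * where
    C0 :: MyGadt ()
    C1 :: Int -> MyGadt Int   -- payload specialised to Int here, for illustration

-- Existential wrapper hiding the type index, as suggested above.
data Hidden :: (* -> *) -> * where
    Hide :: gadt index -> Hidden gadt

-- Hypothetical helper: one function over values with hidden indices.
describe :: Hidden MyGadt -> String
describe (Hide C0)     = "C0"
describe (Hide (C1 n)) = "C1 " ++ show n

main :: IO ()
main = mapM_ (putStrLn . describe) [Hide C0, Hide (C1 7)]
```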
I wonder if the GADTs are an essential part of this patch, or could be removed? They seem to be causing a lot of problems.
Well, I must admit, I've hit a wall trying to get […]
I am pretty sure GADTs can be dropped at the cost of losing some type-safety.
I've changed the milestone to […]
Hey, Cabal dev here. Are there things we can do to help? I think this is definitely a place where we can take changes to the Cabal library to make your life easier.
Hey @ezyang! Sure, any help would be great. Basically, we'd like to integrate Cabal into Hadrian in a more natural way: as a Haskell library, instead of calling […]
I feel like we may have had this discussion before, but if you access Cabal directly as a library, how are you going to make sure Hadrian gets built against the most recent (bootstrap) version of Cabal? Assuming you have the correct version of Cabal, I think you just want a Shake rule for the configure step (not quite just […]). A more ambitious thing to do is to completely Shake-ify Cabal's build system, and then slot those rules into Hadrian. Probably not now!
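The "Shake rule for the configure step" idea could look roughly like the following pseudocode sketch (the file paths, the tracked inputs, and the cached-output location are all assumptions on my part, not an actual Hadrian rule):

```haskell
-- Pseudocode sketch: cache the result of the configure step per package,
-- so it reruns only when its inputs change (paths are hypothetical).
configureRules :: Rules ()
configureRules =
    "_build/*/setup-config" %> \out -> do
        let pkg = takeDirectory1 (dropDirectory1 out)
        need [pkg </> pkg <.> "cabal"]      -- rerun when the .cabal file changes
        cmd "ghc-cabal" ["configure", pkg]  -- or a Cabal-as-a-library equivalent
```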
We already use the Cabal library (see […]
Yep, this sounds right.
So, I recently circled back to thinking about this problem. The stated reason why we can't eliminate the stage0/stage1/stage2 directories is that `ghc-cabal` hard-codes where it places `package-data.mk` files. But I don't see any reason why we couldn't add a new flag to `ghc-cabal` to output the data, e.g., to stdout, where we could slurp it up directly, solving point (1). I reviewed the PR at #166 (is there more significant code?) and I did not see any reason to believe that issues (2) or (3) could be addressed without significantly expanding the scope of the Cabal changes. Indeed, see #170 for an example of where you MUST modify the Cabal library code, as it makes assumptions about the current working directory which you cannot easily manage in process. Another example is the calls to […]

To be clear, it would be very nice if we could rewrite Cabal's build system (in the Cabal library) in Shake. It's something that I've had my eye on for a while. But in my view, it makes sense to shave that yak first, and then integrate those rules with Hadrian (we'll have to work a bit to make sure the two build systems compose).
@ezyang Thanks for your input!
Yes, something like this is indeed possible, although […]
Yes, I believe this is the biggest effort here.
Oh, I didn't realise that. So you are saying the only way we can solve this issue is to change the Cabal library itself? That's definitely not what we had in mind.
I am a bit confused here. In this ticket we do not want to pursue this goal. All we want to do is to teach Hadrian to extract package metadata from […]
@snowleopard the idea is that, rather than Hadrian scraping text files, it could link Cabal directly and use it to generate Shake rules: the same Shake rules that Cabal itself would use. Very elegant!
Also see haskell/cabal#4047. The plan there would teach cabal-install about different GHCs for the purposes of cross-compiling, but the same logic is equally applicable to bootstrapping. A Shake-ified Cabal naturally leads to a Shake-ified cabal-install, yielding the same slick integration as described above. [I believe it's also a goal to factor out the solver from cabal-install as its own library.]
@Ericson2314 I see, this would be great and elegant, but how soon is it going to happen? Do we want to depend on this future Cabal feature in this project? My intuitive answer is no. I think you (and @ezyang) are saying that it just doesn't make sense to invest time in solving this issue until we have a better Cabal library, which does sound reasonable. Perhaps we shouldn't bother indeed. This issue is not on the critical path for Hadrian to replace the old build system, so I'm fine if we decide not to pursue it until it becomes trivial to resolve with a Shake-ified Cabal.
I'm not in a position to propose whether a stopgap is worth it; I just wanted to point out that eventually things can be really, really slick.
Do you mean depend on linking Cabal before Cabal is rewritten with Shake? I agree that would be silly. I assume from the rest of your post that you would like to use it once Cabal is so rewritten.
@Ericson2314 Sorry, I didn't express myself well. I meant that we don't want to wait for the Shake-ified Cabal before releasing Hadrian as the main build system of GHC, i.e. we don't want to depend on it just yet.
Yes, it sounds promising.
Right. And unfortunately, to get the metadata, we have to run the equivalent of a […]
Yes, it is not going to happen for a while, so don't block on it. On the other hand, I don't think it is an insurmountably complex project (ignoring Custom builds for the moment); I might make an attempt at this over the Christmas break. One big question I have, though, is how to set up any such build system so that it can be integrated with Hadrian. Hadrian has its own DSL going on for assembling command-line calls, and I wonder if that should be factored out into a library that cabal-shake could use.
@ezyang Ah, yes! We do want to factor out the generic DSL part of Hadrian into a separate library. I didn't prioritise this, but I can look into it soon (December-ish) if you would like to use it.
It's not that urgent on my end; what I will probably do, if I actually attack this problem, is just copy-paste some of the low-level combinators into my implementation, with the idea of dropping them for the library later. ;)
Done in #531 🎉
A large part of the build system is dedicated to dancing around `package-data.mk` files containing package-related data (such as package name, version, dependencies, etc.). These files are generated by the `utils/ghc-cabal` program, which also needs to be built in a non-trivial way. See `Rules/Data.hs` and `Oracles/PackageData.hs` in particular.

Our long-term plan is to eliminate this and get rid of `ghc-cabal` and `package-data.mk` files altogether. See `Rules/Cabal.hs`, where `Distribution.Package` is used instead of `ghc-cabal` to extract package dependencies directly from `.cabal` files.

A potential solution will need to be carefully thought through and discussed. This is the thread to do this.

Let me formulate the rationale behind this more clearly:

1. `ghc-cabal` creates `package-data.mk` files in a fixed location inside a package directory. This prevents us from moving build artefacts outside the source tree (see #113, "Move build products out of the ghc tree").
2. `ghc-cabal` always runs expensive configure scripts, significantly affecting the performance of the whole build system. On top of that, parallel invocations of `ghc-cabal` are currently broken.
3. The `ghc-cabal` program is built in a non-trivial way, and the `package-data.mk` files need to be parsed, which significantly adds to the complexity of the build system and hence makes it more difficult to understand and maintain. Right now this is undoubtedly the most complicated and unreliable part of the build system.
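To make the parsing burden in point 3 concrete, here is a minimal sketch of the kind of key/value extraction a `package-data.mk` consumer has to do (the function name and the sample input are my own illustration; the real code lives in `Oracles/PackageData.hs`):

```haskell
-- Hypothetical sketch: package-data.mk files are Makefile-style
-- "KEY = VALUE" lines, which the build system must slurp and query.
parsePackageData :: String -> [(String, String)]
parsePackageData = concatMap parseLine . lines
  where
    parseLine l = case break (== '=') l of
        (key, '=' : value) -> [(trim key, trim value)]
        _                  -> []          -- skip lines without '='
    trim = dropWhile (== ' ') . reverse . dropWhile (== ' ') . reverse

main :: IO ()
main = print (lookup "VERSION" (parsePackageData "NAME = base\nVERSION = 4.9.0.0"))
-- prints: Just "4.9.0.0"
```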
files that need to be parsed significantly adds to the complexity of the build system and hence makes it more difficult to understand and maintain. Right now this is undoubtedly the most complicated and unreliable part of the build system.The text was updated successfully, but these errors were encountered: