cabal haddock plus cpp preprocessing with relative #include files #259

bos opened this Issue May 24, 2012 · 5 comments



bos commented May 24, 2012

(Imported from Trac #266, reported by guest on 2008-04-09)

All these bugs are against the homeomorphic library I have written. To
play along, you can do

darcs get --partial

Bug 2: cabal haddock fails

$ cabal haddock
Preprocessing library homeomorphic-0.1...
Running Haddock for homeomorphic-0.1...
Warning: The documentation for the following packages are not installed.
No links will be generated to these packages: base-,
QuickCheck-, mtl-, containers-


Include/Hash.hs: No such file or directory

This one may well be my fault, if some extra magic is required to run
haddock with CPP but not to build - but that seems weird: if there is
enough info to build the file, surely there is enough info to haddock
it.

Duncan says:

Right, there is no such guarantee at the moment. That's because ghc
--make can find things by import chasing where Cabal needs more stuff
specified explicitly. One day when Cabal does the import chasing they'll
be consistent.

In this case though it doesn't look like one of those problems. It seems
to be because for haddock we're copying files to dist/ and then running
cpp on them. That means of course that relative #includes do not work.
It's not entirely clear to me why we are copying them before running cpp
and not just running cpp on the files from their original locations.

Please file a ticket for this one.
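The failure mode Duncan describes can be reproduced directly with cpp, independent of Cabal. This is a minimal sketch with made-up file names: quoted #includes are resolved against the directory of the file being preprocessed, so copying the .hs file to dist/ without the file it includes breaks the include.

```shell
# Set up a module that relatively #includes a sibling file.
mkdir -p demo/src demo/dist
printf '#include "./Common.hs"\n' > demo/src/Lib.hs
printf 'common = ()\n'            > demo/src/Common.hs

# Preprocessing in place works: ./Common.hs is found next to Lib.hs.
cpp -traditional-cpp demo/src/Lib.hs | grep 'common'

# Copy only Lib.hs to dist/ (as Cabal did) and preprocess it there:
# the quoted include is now resolved against demo/dist/, so it fails.
cp demo/src/Lib.hs demo/dist/
cpp -traditional-cpp demo/dist/Lib.hs >/dev/null 2>&1 \
  || echo 'copied file: ./Common.hs not found'
```

Running cpp from the file's original location straight to the destination directory, as proposed below, sidesteps this because the including file never moves.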

bos commented May 24, 2012

(Imported comment by @dcoutts on 2008-04-09)

So the problem is that when using haddock-0.x we copy all .hs files into a temporary directory before running cpp on them. That's the source of the issue, because cpp includes are resolved relative to the location of the including file (as well as the -I search path). So when we copy the .hs file but not the file that it #includes, the included file cannot be found.

The solution I think is to not copy them, but to cpp the file from its original location directly to the destination directory. For .lhs files, we should cpp first and then unlit, not the other way around.

I'm a bit nervous about making this change in a stable release. I'm tempted to punt it to Cabal-1.8. Generally the whole haddock module needs rewriting. It's a big jumbled inconsistent mess, especially the way it handles pre-processing and the differences between haddock 0.x and 2.x.

bos commented May 24, 2012

(Imported comment by @saizan on 2009-01-22)

This should be fixed in Cabal HEAD by

Thu Feb 19 16:37:38 CET 2009  Andrea Vezzosi <>
  * rewrite of Distribution.Simple.Haddock
  In addition to (hopefully) making clear what's going on
  we now do the additional preprocessing for all the versions of haddock
  (but not for hscolour) and we run cpp before moving the files.

bos commented May 24, 2012

(Imported comment by @rrnewton on 2009-03-17)

I just tripped across this problem with cabal-install 0.10.2 and haddock 2.9.2.

I'm not sure if this means it hasn't been fixed or wasn't covered by the fix mentioned above (2.5 years ago).

rrnewton commented Apr 1, 2014

Update: I just tripped across the same thing with cabal-install (and Cabal 1.18.0), plus GHC 7.6.3:

$ cabal haddock
     fatal error: ./Vec2Common.hs: No such file or directory
     #include "./Vec2Common.hs"

But it's not so bad because in this case I can just get rid of the relative includes and move them to a global location, and then add an "Include-Dirs:" line for that location.
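Concretely, that workaround amounts to a library stanza along these lines (the module and directory names here are hypothetical, not from the actual package):

```
-- Hypothetical stanza: the shared fragment Vec2Common.hs lives in
-- an include/ directory rather than next to the module that
-- #includes it, and cpp finds it via the include-dirs search path
-- instead of a relative path.
library
  exposed-modules: Data.Vec2
  include-dirs:    include
  build-depends:   base
```

The module then uses {-# LANGUAGE CPP #-} and a plain #include "Vec2Common.hs" with no leading ./, so resolution goes through -I rather than the including file's directory.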


I don't think relative include paths are all that well-specified in the C or C++ standards? As I recall, people advocate having a reasonably unique directory name for all your include files and prefixing your #includes with that.

Suggest closing.

@ttuegel ttuegel closed this Apr 18, 2016