CPP confusion
Duncan Coutts
duncan.coutts at worc.ox.ac.uk
Tue Oct 16 05:46:26 EDT 2007
All,
There seems to me to be a great deal of confusion about CPP options in
Haskell land. We apply the same cpp options to both C code and Haskell
code, and other tools, like pre-processors, make things even more
confusing.
The latest thing I've stumbled over is HsUnix.h in ghc-6.8. This header
file provides defines for use when pre-processing Haskell code. It used
to live in the global $GHCLIBDIR/include but now lives in the include
dir for the unix package. This is probably the right thing to do,
however...
How do packages that want this header actually use it? If they
#include it directly in a .hs file then it'll work, since ghc -package
unix adds the unix package's include dir to the search path. But what
if someone needs to use it in a .hsc file?
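Concretely, imagine a .hsc module along these lines (the module name
and the constant are purely illustrative, made up for the example):

    -- Foo.hsc: hypothetical module that needs HsUnix.h
    module Foo where

    #include "HsUnix.h"

    -- pull a constant out of the header; HS_UNIX_EXAMPLE is a made-up
    -- name standing in for whatever define we actually need
    example :: Int
    example = (#const HS_UNIX_EXAMPLE)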
Well, then it does not work. It does not work because Cabal does not
pass these include directories to hsc2hs; we have relied on ghc to do
that, but obviously that does nothing for hsc2hs. As it happens, the
fork of hsc2hs that ghc comes bundled with uses ghc as its C compiler,
so if we pass hsc2hs --cflag=-package --cflag=unix then it does work:
hsc2hs ends up passing -package unix to ghc, and ghc adds the unix
package's include dir to the search path when it goes on to call gcc to
actually compile the C code.
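Spelled out as command lines (using the hypothetical Foo.hsc from
above):

    # plain hsc2hs fails: its C compiler is never told where
    # HsUnix.h lives
    $ hsc2hs Foo.hsc

    # ghc's fork of hsc2hs uses ghc as its C compiler, so this works:
    # hsc2hs passes "-package unix" on to ghc, and ghc adds the unix
    # package's include dir before invoking gcc on the generated C
    $ hsc2hs --cflag=-package --cflag=unix Foo.hsc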
Yay for many layers of confusion.
So what is the solution? We could take advantage of the fact that we
know hsc2hs really uses ghc as its C compiler and change the flags we
pass it when we happen to be building with ghc. On the other hand, this
is all non-portable. In fact, it relies entirely on a package database
to remember which search directories to use when pre-processing
dependent packages. In other words, it only has a chance of working
with ghc at the moment, or with nhc in future if/when it implements a
package database.
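For reference, the search directories in question are what ghc's
package database records in the include-dirs field; something like this
(the path shown is illustrative):

    $ ghc-pkg describe unix | grep include-dirs
    include-dirs: /usr/local/lib/ghc-6.8.1/lib/unix-2.2.0.0/include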
What to do?
Short- and long-term suggestions, please.
Duncan