compiling big files?
Malcolm Wallace
Malcolm.Wallace@cs.york.ac.uk
Wed, 30 Oct 2002 11:56:18 +0000
I have occasional problems getting ghc-5.04.1 to compile a very large
source file with -O switched on. The source file is generated by
another tool and weighs in at 25762 lines (2776340 bytes).
Without -O, the compilation goes normally, taking about 10 minutes.
With -O, the compiler uses >280Mb of memory, which causes my machine
with only 256Mb to swap like crazy; but that is OK, because after about
half an hour ghc produces its output in a .hc file. The C compiler
(gcc-2.95.3) then produces a .raw_s file:
$ wc /tmp/ghc*
759338 837040 18321347 /tmp/ghc3992.hc
1711905 3174489 32014101 /tmp/ghc3992.raw_s
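For reference, the single invocation runs roughly this pipeline, as I
understand it (Big.hs stands in for the real generated source; the
temporary names are the ones from the listing above):

$ ghc -O -c Big.hs
  Big.hs             -> /tmp/ghc3992.hc     (ghc: Haskell to C)
  /tmp/ghc3992.hc    -> /tmp/ghc3992.raw_s  (gcc -S)
  /tmp/ghc3992.raw_s -> .s                  (the ghc-mangler)
  .s                 -> Big.o               (assembler)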
It's the next stage, in which (I presume) the ghc-mangler converts the
.raw_s file into a simpler .s file, that fails to complete.
It simply reports:
Killed
make: *** [targets/ix86-Linux/hat-lib-ghc] Error 2
It seems that if I re-run ghc on the intermediate .hc file that was
left behind in the /tmp directory, then the mangler is fine,
and assembly proceeds all the way to the .o file.
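Concretely, the two-step version that works looks roughly like this
(again with Big.hs standing in for the real file):

$ ghc -O -C Big.hs         # stop after generating Big.hc
$ ghc -c Big.hc -o Big.o   # run gcc, mangler, and assembler separately

In my case the first step was not even needed, since the .hc file was
already sitting in /tmp after the killed run.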
So is there any (fixable) reason why the one-step invocation fails,
but the two-step invocation succeeds?
Regards,
Malcolm