compiling big files?

Simon Marlow simonmar@microsoft.com
Wed, 30 Oct 2002 14:24:59 -0000


> I have occasional problems getting ghc-5.04.1 to compile a very large
> source file with -O switched on.  The source file is generated by
> another tool, and is rather large at 25762 lines (2776340 bytes).
>
> Without -O, the compilation goes normally, taking about 10
> minutes or so.
>
> With -O, the compiler uses >280Mb of memory, which causes my machine
> with only 256Mb to swap like crazy, but that is OK, because after
> about half an hour, ghc produces its output in a .hc file.  The C
> compiler (gcc-2.95.3) then produces a .raw_s file:
>
>     $ wc /tmp/ghc*
>      759338  837040 18321347 /tmp/ghc3992.hc
>     1711905 3174489 32014101 /tmp/ghc3992.raw_s

eek! :-S

> It's the next stage, when (I presume) the ghc-mangler converts the
> .raw_s file to a simpler .s file, that doesn't run to completion.
> It simply reports
>
>     Killed
>     make: *** [targets/ix86-Linux/hat-lib-ghc] Error 2
>
> It seems that if I re-run ghc on the intermediate .hc file that
> was left behind in the /tmp directory, then the mangler is ok,
> and assembly proceeds right up to the .o file.
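>
> (Concretely, the retry was something along the lines of
>
>     $ ghc -c /tmp/ghc3992.hc
>
> picking up the .hc file that the failed run left behind.)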
>
> So is there any (fixable) reason why the one-step invocation fails,
> but the two-step invocation succeeds?

I have no idea what might cause this.  If you put the .hc up for
download somewhere, I'll take a look.

In the meantime, you might want to try using the native code generator
instead.  Just add -fasm to the command line.
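
Something like this, assuming a single-module compile (the file name
is just a placeholder):

    $ ghc -O -fasm -c BigGeneratedModule.hs

The native code generator produces assembly directly, so the whole
via-C route (.hc file, gcc, and the mangler) is skipped.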

Cheers,
	Simon