[Haskell-cafe] exceeding resources with GHC compiling (Was: Problem with publishing project/packages)
Henning Thielemann
lemming at henning-thielemann.de
Tue Apr 28 12:12:18 UTC 2020
On Tue, 28 Apr 2020, Ben Franksen wrote:
> Today I ran again into a problem I had several times before: compiling
> Cabal-3.2.* (the library) with ghc-8.2.2 and cabal with default options
> (including jobs: $ncpu, though it actually used only one CPU) eats all
> the memory on my machine (8GB, but I had a Tor browser, another
> browser, and Thunderbird running) so that it completely freezes (no
> mouse, no keyboard). I had to reboot using the sysrq escape hatch. Not funny.
> I think this is due to use of ghc --make and some very large modules.
> Thankfully memory use has improved with later ghc versions.
That's why I never use 'jobs: $ncpu' and also oppose making it the
default setting. [1] Some packages contain modules that frequently eat
up all my resources; I know that building "Cabal the library" is such a
package. I remember that I can save memory by aborting the compilation
and restarting it. It seems that GHC may cache too much. But continuing
an aborted compilation is not possible for imported packages when using
'cabal install'. Other packages contain big modules generated
automatically by Template Haskell.
[1] https://github.com/haskell/cabal/issues/5776
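A minimal sketch of the setting in question (assuming cabal-install's
user-wide config file at ~/.cabal/config; comments are mine):

```
-- in ~/.cabal/config (user-wide) or cabal.project.local (per project):
-- build one package at a time instead of one job per CPU core
jobs: 1
```

GHC's own heap can also be capped for a single build by passing RTS
options through cabal, e.g. `cabal build --ghc-options="+RTS -M4G -RTS"`
(the 4G limit is an illustrative value, not a recommendation), so that a
memory-hungry module aborts with a heap-overflow error instead of
freezing the whole machine.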