[Haskell-cafe] GHC 7.0.1 developer challenges

John D. Ramsdell ramsdell0 at gmail.com
Mon Dec 13 23:34:42 CET 2010

On Mon, Dec 13, 2010 at 10:17 AM, Mathieu Boespflug <mboes at tweag.net> wrote:
> Hi John,
> Why don't you use ulimit for this job?

By default, the GHC runtime will allocate memory beyond what it takes
to cause thrashing on a Linux box.  However, if you give the GHC
runtime a limit with the -M option and it wants too much memory, the
GHC runtime is smart enough not to ask for more, but to garbage
collect more often.  If you ulimit the GHC runtime, the process is
killed when it asks for too much memory, right?
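The difference can be seen from the shell: a ulimit is a hard cap the
kernel enforces, so an over-limit allocation simply fails, whereas -M
is a target the GHC RTS works under.  A minimal sketch of the hard-cap
side, using dd (not a GHC program) as a stand-in allocator on Linux:

```shell
#!/bin/sh
# In a subshell, cap virtual memory at ~10 MB, then ask dd to allocate
# a 100 MB buffer.  The kernel refuses the allocation, so dd fails --
# this is what would happen to a GHC program under ulimit alone,
# whereas under -M the RTS would garbage collect harder instead.
result=$( ulimit -v 10240 2>/dev/null
          dd if=/dev/zero of=/dev/null bs=100M count=1 2>/dev/null \
            && echo "allocation succeeded" \
            || echo "allocation failed" )
echo "$result"
```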

I have enclosed a small script I contributed in another thread that
shows how I tested it.  If you run my cpsagraph program on my laptop
with a large, but not too large input, the program causes OS thrashing
and takes ten minutes to run.  If you limit the memory using the
script, which chooses a limit of around 750m, the program completes in
48 seconds!  Watching top shows that the program gets 100% of the CPU
during the fast run.  The script chooses the best memory limit, not
too small, and not too big.

-------------- next part --------------
#! /bin/sh
# Compute the free memory on a Linux system with /proc/meminfo.
# Set GHCRTS accordingly.
# Source this file or pass it a command to run in the extended environment.

GHCRTS=-M`awk '
/^MemFree:/ { free += $2 }
/^Buffers:/ { free += $2 }
/^Cached:/  { free += $2 }
END         { print free "k" }' /proc/meminfo`
export GHCRTS
if [ -n "$1" ]
then
    "$@"
fi
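To see what the awk pipeline in the script computes, here is the same
pipeline run against a few sample /proc/meminfo lines (the numbers are
made up) instead of the real file: it sums MemFree, Buffers, and
Cached, and prints the total in kilobytes as a value for -M.

```shell
#!/bin/sh
# Feed sample meminfo lines to the same awk program the script uses.
# 500000 + 100000 + 400000 kB of reclaimable memory = 1000000k.
free=$(printf 'MemTotal:  8000000 kB
MemFree:    500000 kB
Buffers:    100000 kB
Cached:     400000 kB
' | awk '
/^MemFree:/ { free += $2 }
/^Buffers:/ { free += $2 }
/^Cached:/  { free += $2 }
END         { print free "k" }')
echo "GHCRTS=-M$free"   # prints GHCRTS=-M1000000k
```

Buffers and Cached are counted as free because the kernel can reclaim
them, so the limit handed to GHC is everything the program could take
without pushing the machine into swap.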