[Haskell-cafe] GHC 7.0.1 developer challenges

Ketil Malde ketil at malde.org
Thu Dec 16 10:45:10 CET 2010

Simon Marlow <marlowsd at gmail.com> writes:

> ulimit is a good way to catch an infinite loop.  But it's not a good
> way to tell GHC how much memory you want to use - if GHC knows the
> memory limit, then it can make much more intelligent decisions about
> how to manage memory.  

I'm interpreting this to mean that GHC doesn't know the limit set by
ulimit?  It seems to me that GHC should check this, and adjust its heap
limit accordingly.
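
(To make that concrete, here is a minimal sketch of reading the
address-space limit from inside a Haskell program, using
getResourceLimit from System.Posix.Resource in the unix package.  It
only shows that the information is available to the process; whether
the RTS should consult it at startup is the suggestion above, not
something GHC currently does.

    import System.Posix.Resource (Resource(ResourceTotalMemory),
                                  ResourceLimit(..), ResourceLimits(..),
                                  getResourceLimit)

    main :: IO ()
    main = do
      ResourceLimits soft _hard <- getResourceLimit ResourceTotalMemory
      putStrLn $ case soft of
        ResourceLimit bytes   -> "address-space limit: " ++ show bytes ++ " bytes"
        ResourceLimitInfinity -> "no address-space limit set"
        ResourceLimitUnknown  -> "address-space limit unknown"

The RTS would presumably want the same information via the C API
rather than this library, but the principle is the same.)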

> The -M flag causes the GC algorithm to switch from copying (fast but
> hungry) to compaction (slow but frugal) as the limit approaches.
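
(For anyone trying this at home: -M is an RTS option, and with 7.0 the
program has to be built with -rtsopts for RTS flags to be accepted on
the command line, e.g.

    $ ghc -rtsopts --make MyProg.hs
    $ ./MyProg +RTS -M512m -RTS

where 512m is of course just an example limit.)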

In the absence of any explicit limits, I think a sensible default is to
set maximum total memory use to something like 80%-90% of physical RAM.
I've yet to see a Haskell program use more than physical RAM without
driving performance (of the system, not just the program) into the
ground.
The downside of using ulimit is that it's a bit complicated, not very
portable, and IIRC it's not entirely obvious which option does what.  So
some good defaults would be nice.
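
(Concretely, with bash on Linux the option you want is -v, which caps
the address space in kB; -m nominally limits the resident set size but
is, as far as I know, not enforced by current Linux kernels:

    $ ulimit -v 4194304    # limit the address space to 4GB
    $ ./MyProg

The numbers and behaviour here are from memory, so check your shell's
documentation.)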

If I haven't seen further, it is by standing in the footprints of giants
