[Haskell] Realistic max size of GHC heap
Simon Marlow
simonmar at microsoft.com
Thu Sep 15 06:42:44 EDT 2005
On 15 September 2005 01:04, Karl Grapone wrote:
> I'm considering using Haskell for a system that could, potentially,
> need 5GB-10GB of live data.
> My intention is to use GHC on Opteron boxes which will give me a max
> of 16GB-32GB of real RAM. I gather that GHC is close to being ported
> to amd64.
>
> Is it a realistic goal to operate with a heap size this large in GHC?
> The great majority of this data will be very long-tenured, so I'm
> hoping that it'll be possible to configure the GC to not need too much
> peak memory during the collection phase.
It'll be a good stress test for the GC, at least. There is no reason
in principle why you can't have a heap this big, but major collections
are going to take a long time. It sounds like in your case most of this
data is effectively static, so in fact a major collection will be of
little use.
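
For concreteness, here is the sort of small program and the RTS knobs
involved (the module name, data size and flag values below are only
placeholders, not recommendations):

-- BigHeap.hs: stand-in for a program with a large, mostly-static heap
module Main where

import qualified Data.Map as Map

main :: IO ()
main = do
  -- Build the long-lived structure up front (a stand-in for the
  -- 5GB-10GB of live data described above).
  let static = Map.fromList [(i, i * 2) | i <- [1 :: Int .. 1000000]]
  -- Touch it so it is resident before the long-running part begins.
  print (Map.size static)
  -- ... long-running work that mostly just reads 'static' ...
  print (Map.lookup 12345 static)

-- Compile and run with RTS flags that trade nursery size and
-- generation count against how often major collections happen, e.g.:
--   ghc --make -O2 BigHeap.hs -o bigheap
--   ./bigheap +RTS -A256m -G3 -c -sstderr -RTS
-- (-A sets the allocation area, -G the number of generations, -c
--  compacts the oldest generation instead of copying it, -sstderr
--  prints GC statistics so you can see where the time goes)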
Generational collection tries to deal with this in an adaptive way:
long-lived data gets traversed less and less often as the program runs,
as long as you have enough generations. But if the programmer really
knows that a large chunk of data is going to be live for a long time, it
would be interesting to see whether this information could be fed back
in a way that the GC can take advantage of. I'm sure there must be
existing techniques for this sort of thing.
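
One crude way to approximate that by hand today (this is just a sketch;
the sizes and the pair of performGC calls are illustrative, not an RTS
feature) is to force the long-lived data as soon as it is built and then
request a couple of major collections, so it is tenured into the oldest
generation before the real work starts:

module Main where

import Control.Exception (evaluate)
import System.Mem (performGC)
import qualified Data.Map as Map

main :: IO ()
main = do
  let static = Map.fromList [(i, i) | i <- [1 :: Int .. 1000000]]
  -- Data.Map is spine-strict, so evaluating the map to WHNF builds
  -- the whole tree here rather than lazily during the main loop.
  _ <- evaluate static
  -- Ask for a couple of major collections right away: the intent is
  -- that the freshly built data is tenured into the oldest generation
  -- now, instead of being copied again by later collections.
  performGC
  performGC
  -- ... long-running work that only reads 'static' from here on ...
  print (Map.lookup 4242 static)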
Cheers,
Simon