[Haskell] Realistic max size of GHC heap
S. Alexander Jacobson
alex at alexjacobson.com
Thu Sep 15 09:48:03 EDT 2005
Should one interpret this as meaning that GHC now targets 64-bit systems, or does
one need to employ some sort of cleverness to use this much memory?
(I posted this question a while ago and was told that GHC did not, at
that time, support 64-bit and so could not use that much memory.)
On a related note, does GHC now distribute IO threads over multiple
CPUs or is it still a 1 CPU system?
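For context, concurrency in GHC is expressed with forkIO, which creates a lightweight Haskell thread multiplexed by the runtime; whether such threads end up spread across CPUs is a property of the RTS, not of the code. A minimal sketch (the computation here is purely illustrative):

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

main :: IO ()
main = do
  done <- newEmptyMVar
  -- Run the work on a separate lightweight Haskell thread.
  _ <- forkIO $ putMVar done (sum [1 .. 1000000 :: Int])
  -- Block until the worker finishes and hand back its result.
  result <- takeMVar done
  print result
```

The program itself looks the same either way; distributing the threads over multiple CPUs is a matter of how the runtime schedules them onto OS threads.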
S. Alexander Jacobson tel:917-770-6565 http://alexjacobson.com
On Thu, 15 Sep 2005, Simon Marlow wrote:
> On 15 September 2005 01:04, Karl Grapone wrote:
>> I'm considering using haskell for a system that could, potentially,
>> need 5GB-10GB of live data.
>> My intention is to use GHC on Opteron boxes, which will give me a max
>> of 16GB-32GB of real RAM. I gather that GHC is close to being ported
>> to amd64.
>> Is it a realistic goal to operate with a heap size this large in GHC?
>> The great majority of this data will be very long tenured, so I'm
>> hoping that it'll be possible to configure the GC to not need too much
>> peak memory during the collection phase.
> It'll be a good stress test for the GC, at least. There are no reasons
> in principle why you can't have a heap this big, but major collections
> are going to take a long time. It sounds like in your case most of this
> data is effectively static, so in fact a major collection will be of
> little use.
> Generational collection tries to deal with this in an adaptive way:
> long-lived data gets traversed less and less often as the program runs,
> as long as you have enough generations. But if the programmer really
> knows that a large chunk of data is going to be live for a long time, it
> would be interesting to see whether this information could be fed back
> in a way that the GC can take advantage of it. I'm sure there must be
> existing techniques for this sort of thing.
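For what it's worth, the generational behaviour described above is steered through GHC's RTS options. A sketch, where ./myprog stands for any GHC-compiled binary (the exact flags and accepted size syntax depend on the GHC version in use):

```shell
# Suggest a large initial heap so the long-lived data can be
# allocated up front without repeated heap growth
./myprog +RTS -H4096m -RTS

# Cap the maximum heap, and ask for extra generations so that
# long-tenured data is traversed less and less often by the GC
./myprog +RTS -M10240m -G4 -RTS
```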
> Haskell mailing list
> Haskell at haskell.org