garbage collection and other newbie's issues
mailing_list at istitutocolli.org
Fri Oct 20 06:23:16 EDT 2006
On Thu, Oct 19, 2006 at 06:23:35PM +0400, Bulat Ziganshin wrote:
> Hello Andrea,
> Wednesday, October 18, 2006, 9:34:28 PM, you wrote:
> > solution? Or just some hints on the kind of problem I'm facing: is it
> > related to strictness/laziness, or it's just that I did not understand
> > a single bit of how garbage collection works in Haskell?
> i think the second. unfortunately, i don't know a good introduction
> to the actual GC implementation, although i can give you a pair of
> not-so-friendly references:
> shortly speaking, memory allocated by GHC never shrinks.
well, you gave me a wonderfully clear introduction to Haskell GC, and
now I have a better understanding of the output of the various
profiling I'm doing. Thank you very much!
Still, I cannot understand my specific problem: why does the
function that reads a file retain so much memory?
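One thing worth checking first: readFile and hGetContents are lazy, so a file only looks "read" once something actually demands its contents; until then a thunk keeps the handle and buffer alive. A common idiom (circa GHC 6.x) is to demand the string's length to force the whole file in at once. This is a sketch, not the actual code from my program:

```haskell
-- readFile is lazy: it reads the file on demand.  Demanding the length
-- of the result forces the entire string into memory immediately, so
-- nothing downstream can hold a half-read file alive.
readFileStrict :: FilePath -> IO String
readFileStrict path = do
  s <- readFile path
  length s `seq` return s
```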
I did some tests, and the results are puzzling:
- I tried reading the feed and converting it directly into the opml
chunk to be inserted into the opml component of my StateT monad. The
problem becomes far worse. Here is the output of a heap profile:
As you can see, opening one feed (397,868 bytes), closing it, opening
another one (410,052 bytes), closing it, and reopening the first one
brings memory consumption to 152 MB.
Using the intermediate datatype (that is, reading the feed,
transforming it into my datatype, and only then into the opml tree)
brings memory consumption down to 92 MB for the very same operations.
Making the intermediate datatype strict gives almost the same results:
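For reference, strict fields are declared with `!` on the constructor arguments; the type and field contents below are made up, since my real datatype isn't shown here. One caveat that may explain the "almost the same results": strictness annotations only evaluate the fields to weak head normal form, which for a String is just the first cons cell:

```haskell
-- Hypothetical stand-in for the intermediate feed datatype.  The (!)
-- annotations force both fields when the constructor is applied -- but
-- only to WHNF, so the tail of each String can still be an unevaluated
-- thunk holding the parsed XML alive.
data FeedItem = FeedItem !String !String

-- Forcing the full strings needs an explicit traversal, e.g.:
mkFeedItem :: String -> String -> FeedItem
mkFeedItem t b = length t `seq` length b `seq` FeedItem t b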
Now I am coming to believe that the file reading is indeed strict, and
that my problem may be related to StateT laziness.
Does this make sense?
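If so, the mechanism would be this: with the lazy State/StateT from Control.Monad.State, `modify` only queues up a thunk, so repeated updates build a chain `f (f (f s0))` that keeps every previous state (here, every previous opml tree) reachable. Forcing the new state before storing it releases the old one. A minimal sketch with an Int standing in for the real state:

```haskell
import Control.Monad.State

-- Lazy version: each modify records a thunk; nothing is evaluated
-- until the final state is demanded, so old states pile up.
lazyCount :: Int -> State Int ()
lazyCount n = replicateM_ n (modify (+1))

-- Strict version: seq forces the new state before put, so the
-- previous state can be collected immediately.
strictCount :: Int -> State Int ()
strictCount n = replicateM_ n $ do
  s <- get
  let s' = s + 1
  s' `seq` put s'
```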
I'm now going to try implementing my opml state as an IORef, using a
ReaderT monad, to see whether anything changes.
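The plan would look roughly like the sketch below (again with an Int as a placeholder for the real opml type). Note that writeIORef is itself lazy in its argument, so the value still needs forcing, or the IORef just holds a thunk over the old tree:

```haskell
import Control.Monad.Reader
import Data.IORef

type Opml = Int                      -- placeholder for the real opml type
type App  = ReaderT (IORef Opml) IO  -- environment carries the mutable state

putOpml :: Opml -> App ()
putOpml new = do
  ref <- ask
  -- force the new value before storing it, so the IORef never
  -- retains a thunk referencing the previous state
  new `seq` liftIO (writeIORef ref new)

getOpml :: App Opml
getOpml = ask >>= liftIO . readIORef
```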
> ps: if your program uses a lot of strings, FPS will be a great help.
> it doesn't change the GC behavior, just makes everything 10 times smaller
Yes, but I'm using HXT, which stores XML text nodes as ordinary
Strings. So I could get some improvement on the IO side, but not much
in memory consumption, unless I change my implementation entirely.
Anyway, even if I could reduce the memory consumption for reading 2
feeds from 152 MB to 15 MB, I would still run out of memory on my
laptop in a day instead of in five minutes. So I should face the fact
that it is not Haskell's string implementation that is causing the
problem. The problem is probably me!
Thanks for your kind attention.