HaXml, memory usage and segmentation fault
Dmitry Astapov
adept@umc.com.ua
29 Oct 2001 17:54:49 +0200
JE> I got it to compile with ghc 5.02 using
JE> ghc --make -package lang translate.hs
JE> The compiled version succeeds, but on a large document it uses a *lot*
JE> of memory and starts paging pretty badly.
Exactly. A PIII-800 with 192M of RAM died on me in swap when I tried to run
the compiled version with a 16M stack and an input file with 100000 children
in one node.
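
(For reference, that is how the run looked on my side, roughly; with a binary
built by GHC 5.02 the stack is enlarged through the RTS options, and the
redirections are only illustrative, adjust to however translate takes its
input:

    ./translate +RTS -K16m -RTS <big-input.xml >output.xml
)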
>> JE> Try the identity transform 'main = processXmlWith keep' on your sample
>> JE> document and see if that runs out of heap too. If so, there's not
>> JE> much you can do short of replacing the HaXml parser.
I tried this with ghc 5.02, and it ran in 20M of RAM or so. It could be less,
but at least it runs and does not segfault like Hugs does :)
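
(For anyone following along, the complete identity-transform program is
roughly the following sketch; the imports are a guess on my part, since the
wrapper module has moved around between HaXml releases -- old versions export
these names from XmlLib, later ones from the Text.XML.HaXml.* hierarchy:

    module Main where

    -- Identity transform: parse the input document, apply the 'keep'
    -- filter (which passes all content through unchanged) and write the
    -- result back out.  Module names are version-dependent.
    import Text.XML.HaXml.Wrappers    (processXmlWith)
    import Text.XML.HaXml.Combinators (keep)

    main :: IO ()
    main = processXmlWith keep
)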
JE> I tried this as well, modifying your program to use an XML parser I
JE> wrote a while ago that has better laziness properties than the HaXML
JE> one. Alas, my parser also suffers from a space leak under Hugs, so
JE> this only deferred the problem. Under ghc/ghci, though, it has modest
JE> memory requirements and runs without paging.
Is its distribution restricted? Is it possible to get it somewhere, use it,
patch it, etc.?
--
Dmitry Astapov //ADEpt E-mail: adept@umc.com.ua
GPG KeyID/fprint: F5D7639D/CA36 E6C4 815D 434D 0498 2B08 7867 4860 F5D7 639D