[Haskell-cafe] Battling laziness

Bulat Ziganshin bulatz at HotPOP.com
Fri Dec 16 09:30:35 EST 2005

Hello Joel,

Friday, December 16, 2005, 2:44:00 PM, you wrote:

JR> I have a huge space leak someplace and I suspect this code. The
JR> SrvServerInfo data structure is something like 50K compressed or  
JR> uncompressed byte data before unpickling. My thousands of bots issue  
JR> this request at least once and I almost run out of memory with 100  
JR> bots on a 1Gb machine on FreeBSD. Do I need deepSeq somewhere below?

1. try to use a 3-generation GC; this may greatly help in reducing GC time
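a minimal sketch of what that looks like on the command line (the program name "bots" is a placeholder; -G3 is the GHC RTS flag that sets the number of GC generations, and newer GHCs need -rtsopts at link time to accept RTS flags):

```
$ ghc -rtsopts Main.hs -o bots
$ ./bots +RTS -G3 -sstderr -RTS   # -sstderr prints GC statistics on exit
```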

2. manually add {-# UNPACK #-} to all simple fields (Ints, Words,
Chars). don't use "-funbox-strict-fields" because it can unbox whole
structures instead of sharing them
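a sketch of what that annotation looks like, using a hypothetical table-info record (the field names are invented for illustration):

```haskell
-- An unpacked strict field stores the raw Int# inline in the
-- constructor instead of as a pointer to a heap-allocated box.
data TableInfo = TableInfo
  { tiTableID :: {-# UNPACK #-} !Int   -- unboxed, stored inline
  , tiSeats   :: {-# UNPACK #-} !Int
  , tiName    :: !String               -- strict, but a String can't be unpacked
  }

main :: IO ()
main = print (tiTableID (TableInfo 42 9 "holdem"))
```

applying the pragma per-field like this keeps control in your hands, which is exactly why it is preferable to the blanket compiler flag.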

3. in my experience, it's enough to mark all fields in heavily used
structures as strict and then evaluate the top level of such structures
(using "return $! x"); after that the whole structure will be fully
evaluated. but when you use a list, you must either manually evaluate the
whole list (using "return $! length xs") or use DeepSeq, as you suggest,
because lists remain unevaluated despite all these strictness annotations
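the point above can be sketched like this, with a hypothetical Reply record (forceReply and the field names are invented for illustration):

```haskell
import Control.Exception (evaluate)

-- Strict Int field, plus a list field: forcing the outer constructor
-- with $! evaluates rCount, but the list stays an unevaluated thunk.
data Reply = Reply { rCount :: !Int, rIds :: [Int] }

forceReply :: Reply -> IO Reply
forceReply r = do
  -- walking the length forces the whole spine of the list
  -- (same effect as: return $! length (rIds r))
  _ <- evaluate (length (rIds r))
  return $! r

main :: IO ()
main = do
  r <- forceReply (Reply 2 [1, 2, 3])
  print (rCount r + sum (rIds r))
```

note that "length" only forces the spine; if the elements themselves can be expensive thunks, DeepSeq (or summing/forcing each element) is still needed.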

4. you can try to use packed strings or unboxed arrays instead of
lists. in my experience this can greatly reduce GC time, simply because
such arrays don't need to be scanned on each GC
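for example, the table IDs could live in an unboxed array rather than a [Int] (a sketch; tableIds is a made-up name):

```haskell
import Data.Array.Unboxed (UArray, listArray, (!))

-- The payload of a UArray is raw memory with no pointers in it,
-- so the garbage collector never has to scan or copy its elements.
tableIds :: UArray Int Int
tableIds = listArray (0, 4) [101, 102, 103, 104, 105]

main :: IO ()
main = print (tableIds ! 2)
```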

5. what is the "uncompress" function here? can i see its code?

6. why does EACH bot receive and process this 50k structure itself?
can't that be done just once for all of them?

JR>      do let tables = filter (tableMatches filters) $ activeTables cmd
JR>             ids = map tiTableID tables
JR>         return $! Eat $! Just $! Custom $! Tables $! ids

here `ids` will definitely remain unevaluated, except for the first
element. add "return $! length ids" before the last line
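the suggested fix, sketched with minimal stand-ins for the poster's types (Tables and serveTableList here are hypothetical simplifications of the real code):

```haskell
newtype Tables = Tables [Int]

serveTableList :: [Int] -> IO (Maybe Tables)
serveTableList ids = do
  _ <- return $! length ids      -- force the whole spine before returning
  return $! Just $! Tables ids

main :: IO ()
main = do
  mt <- serveTableList [7, 8, 9]
  case mt of
    Just (Tables xs) -> print (sum xs)
    Nothing          -> print (0 :: Int)
```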

ps: last week i also fought against the memory requirements of my own
program. as a result, they were reduced 3-4 times :)

Best regards,
 Bulat                            mailto:bulatz at HotPOP.com
