Fri, 29 Nov 2002 12:02:37 -0000
>I haven't been able to discern any pattern among those experiencing long
>link times so far, except that -export-dynamic flag used by the dynamic
>loader stuff seems to cause the linker to go off into space for a while.
We're still investigating here, but here's a quick summary of what
we've found so far:
- NFS doesn't seem to have too drastic an effect; even in-memory disks
don't speed things up, so the time seems to be spent in computation
- on our (admittedly overloaded and dated) main Sun server, linking
can take some 20 minutes!
- we've found a more modern (and not yet well-utilized;-) Sun server,
bringing the time down to 6 minutes..:-(
(from that, I thought linking might have to be expensive - how naive!-)
- the same program on my rather old 366MHz PII notebook links in
about 1 minute (I didn't notice that at first, because overall compile
time is longer on my notebook - but that turns out to be caused by
a single generated file, on which the assembler almost chokes; after
all, the notebook "only" has 192MB of memory, and the disk is crammed)
- with the notebook as reference, I'd guess the problem is not GHC's fault
(unless it does things drastically differently on Cygwin vs. Solaris?)
- on our Suns, gcc (and hence ghc) seems to use the native linker
- SunSolve lists several linker patches addressing problems like
"linker orders of magnitude slower than GNU's". We seem to have
those patches, but we're checking again..
Moral so far: if compilation of big projects takes a long time, it is worth
checking where that time is spent. For the same project, on different
systems, we've seen different bottlenecks:
- large (generated) files [all systems]: the assembler needs an awful lot
of space (too little space -> compilation takes forever)
- network disks: import chasing takes a lot of time
- Suns (?): linking takes too long
will report again if we get better news..
PS. if we get linking times down to what seems possible, incremental
linking would no longer be urgent - we'll see..