Compiling large source files
simonpj at microsoft.com
Thu Aug 6 03:22:37 EDT 2009
It should be pretty much linear, modulo a log(n) factor; but log(n) is practically constant (64 or so).
But people often report that GHC is slow (perhaps non-linearly so) when compiling vast blobs of literal data. Because there is a reasonable workaround (read and parse the data at runtime), which actually makes your program more flexible (since you can change the data without recompiling), we have repeatedly failed to look hard at this problem. I wish someone would nail it, though.
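The workaround mentioned above can be sketched as follows: instead of embedding tens of thousands of literals in the source, keep the data in a text file and parse it at program start. The file name, the one-number-per-line format, and the `parseTable` helper below are all hypothetical, just to illustrate the idea:

```haskell
module Main where

-- Hypothetical parser: one Int per line of text.
-- In the real program the input would come from a data file,
-- e.g.  contents <- readFile "table.txt"
parseTable :: String -> [Int]
parseTable = map read . lines

main :: IO ()
main = do
  let contents = "1\n2\n3\n"   -- stands in for the file contents
  print (sum (parseTable contents))
```

Since the data is now ordinary input rather than compiled code, changing it no longer requires recompiling, and GHC never has to typecheck or optimise an 85k-element literal.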
| -----Original Message-----
| From: glasgow-haskell-users-bounces at haskell.org [mailto:glasgow-haskell-
| users-bounces at haskell.org] On Behalf Of Serge D. Mechveliani
| Sent: 04 August 2009 13:31
| To: Simon Marlow
| Cc: glasgow-haskell-users at haskell.org
| Subject: Re: Compiling large source files
| On Tue, Aug 04, 2009 at 09:12:37AM +0100, Simon Marlow wrote:
| > I suggest not using Haskell for your list. Put the data in a file and
| > read it at runtime, or put it in a static C array and link it in.
| > On 03/08/2009 22:09, Günther Schmidt wrote:
| > >Hi Thomas,
| > >yes, a source file with a single literal list with 85k elements.
| when a program only defines and returns a String constant of n
| literals, how much memory does ghc-6.10.4 need to compile it?
| O(n), or maybe O(n^2), or ...?
| Serge Mechveliani
| mechvel at botik.ru