Compiling large source files
Simon Marlow
marlowsd at gmail.com
Tue Aug 4 09:22:38 EDT 2009
On 04/08/2009 13:30, Serge D. Mechveliani wrote:
> On Tue, Aug 04, 2009 at 09:12:37AM +0100, Simon Marlow wrote:
>> I suggest not using Haskell for your list. Put the data in a file and
>> read it at runtime, or put it in a static C array and link it in.
>>
>> On 03/08/2009 22:09, Günther Schmidt wrote:
>>> Hi Thomas,
>>> yes, a source file with a single literal list with 85k elements.
>
>
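[Editor's note: the "read it at runtime" option quoted above can be sketched as follows. This assumes the 85k elements are Ints kept one per line; the file name "elements.txt" is only an illustration.]

```haskell
module Main where

-- Parse one element per line. The element type (Int) and the file layout
-- are assumptions for the sake of the sketch.
parseElems :: String -> [Int]
parseElems = map read . lines

main :: IO ()
main = do
  -- Stand-in data so the sketch is self-contained; in practice the file
  -- would ship alongside the program instead of living in Haskell source,
  -- so GHC never has to compile the 85k-element literal.
  writeFile "elements.txt" (unlines (map show [1 .. 5 :: Int]))
  elems <- parseElems <$> readFile "elements.txt"
  print (sum elems)
```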
> People,
> when a program only defines and returns a String constant of n
> literals, how much memory needs ghc-6.10.4 to compile it ?
> O(n), or may be O(n^2), or ...
Certainly not O(n^2); we would class that as a bug. I can imagine it
might be worse than linear, however, if not in memory then at least in
time. Here's some very rough reasoning: GHC has to maintain various
kinds of mappings to do its work. A mapping from variables is O(log n)
to look up in, and we have to do at least O(n) lookups since there are
O(n) variables, so compilation will be O(n log n).
We should remember that typechecking is worst-case exponential in space
and time, though that is rarely an issue in practice.
String constants are a special case because they are stored internally
as flat arrays of bytes, so you'll probably get closer to O(n) with a
much better constant factor.
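[Editor's note: the distinction Simon draws can be illustrated by how GHC desugars the two kinds of literal. This is a hedged sketch; the exact Core output depends on the GHC version.]

```haskell
-- A String literal is desugared to an unpack of a single packed byte-array
-- literal, so it contributes one binding regardless of its length:
bigString :: String
bigString = "abcdefgh"        -- one flat run of bytes internally

-- A list literal is desugared to explicit cons cells, one per element, so
-- the number of bindings GHC must track grows with n:
bigList :: [Int]
bigList = [1, 2, 3]           -- i.e. 1 : 2 : 3 : []
```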
Cheers,
Simon
More information about the Glasgow-haskell-users mailing list