k.schupke at imperial.ac.uk
Fri Apr 22 17:16:49 EDT 2005
Glynn Clements wrote:
>Personally, I doubt that Haskell will ever be practical for processing
>very large amounts of data (e.g. larger than your system's RAM).
>When processing large amounts of data, rule #1 is to do as little as
>possible to most of the data; don't even read it into memory if you
>can avoid it (e.g. create an index and use lseek() as much as
>possible). You certainly don't want to be boxing/unboxing gigabytes.
Just like to note that I have found Haskell fine for processing large
datasets, if used in the old batch processing model (and easy to code as
you can use lazy lists to stream the data). Also, in the future, as 64-bit
systems become more widely used, mmap-ing a file into an MArray could be an
efficient way to deal with large datasets.
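To illustrate the batch-processing style mentioned above, here is a minimal
sketch of streaming over a file with lazy lists (the filename "data.txt" is
just an example): readFile reads the file lazily, so the fold consumes one
chunk at a time and earlier chunks can be garbage-collected, keeping memory
use roughly constant regardless of file size.

```haskell
module Main where

import Data.List (foldl')

main :: IO ()
main = do
    -- readFile is lazy: the file is read on demand as the list
    -- of lines is consumed, not loaded into memory up front.
    contents <- readFile "data.txt"
    -- foldl' keeps the accumulator strict, so no thunks pile up
    -- while streaming through an arbitrarily large file.
    let total = foldl' (+) 0 (map read (lines contents)) :: Integer
    print total
```

The usual caveat with lazy I/O applies: the file handle stays open until the
whole list has been consumed, which is fine in this single-pass batch model.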