[Haskell-cafe] Yet another binary reader (10x faster than Hackage's binary)
olf at aatal-apotheke.de
Wed Mar 20 19:35:04 UTC 2019
My (admittedly limited) experience with parsers, of which deserializers are a special case, is that with complicated data structures the majority of computing time is spent transforming the raw data (i.e. chunks of bytes) into meaningful information such as Doubles, timestamps, tree structures, and so on. Of course one could argue that such conversions are outside the scope of the deserializer, and that these operations must be performed regardless of which library one chooses. Nevertheless, it somewhat diminishes the importance of the choice of serialization library.
Your benchmarks do not go beyond adding integers, as far as I can see, which is a relatively cheap operation. I'd be interested in a benchmark where, e.g., a huge Set or an array of custom records is used. You seem to work in bioinformatics; there are plenty of examples from that field, e.g. genomic annotations.
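To sketch what such a benchmark could look like with Hackage's binary library (the Annotation record below is hypothetical, loosely modelled on a genomic annotation entry; a real measurement would wrap the encode/decode calls in a benchmarking harness such as criterion rather than just checking the roundtrip):

```haskell
{-# LANGUAGE DeriveGeneric #-}
module Main where

import Data.Binary (Binary, encode, decode)
import qualified Data.Set as Set
import GHC.Generics (Generic)

-- Hypothetical record resembling a genomic annotation entry.
data Annotation = Annotation
  { chrom :: String
  , start :: Int
  , end   :: Int
  , score :: Double
  } deriving (Eq, Ord, Show, Generic)

-- Generic-derived instance; binary also ships a Binary instance for Set.
instance Binary Annotation

main :: IO ()
main = do
  let anns = Set.fromList
        [ Annotation ("chr" ++ show (i `mod` 22 + 1)) i (i + 100) (fromIntegral i / 7)
        | i <- [1 .. 100000 :: Int] ]
      bytes = encode anns
      anns' = decode bytes :: Set.Set Annotation
  -- Roundtrip check; the decode is where the conversion cost shows up.
  print (anns == anns')
```

The point of such a workload is that decoding must rebuild the Set's tree structure and convert bytes into Strings and Doubles, so the per-byte cost of the underlying reader matters far less than in an integer-summing benchmark.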