aeson and dlist in HP 2013.4.0.0

Bardur Arantsson spam at scientician.net
Thu Nov 28 19:19:00 UTC 2013


On 2013-11-28 17:42, Bas van Dijk wrote:
> On 28 November 2013 13:42, Sven Panne <svenpanne at gmail.com> wrote:
>> Just to add my 2c: Given all these new packages which would need to
>> be pulled into the HP just for aeson, let's not include aeson for
>> 2013.4.0.0 and release 2013.4.0.0 soon without the need for lengthy
>> discussions.
> 
> As the proposer for inclusion of aeson in the HP I'm beginning to agree.
> 
> There's another reason I would like to postpone the aeson inclusion: I
> just started working on improving the encoding performance of aeson.
> This requires some significant changes to the API. Therefore I think
> it would be better to see how well this new API works out. If it works
> out, release it as aeson-7 (or aeson-8) and include that release in
> the HP after next. This way we have time to discuss the new
> dependencies and the HP remains stable.
> 

[--snip lots of interesting details--]

You mentioned generating JSON, so I thought I'd mention that it might
also be possible to speed up *parsing* hugely, assuming that only a few
fields/values are actually needed/evaluated.

There's a very interesting paper called

  "Semi-Indexing Semi-Structured Data in Tiny Space"
  (G. Ottaviano, R. Grossi, 2011)

which basically replaces the whole "parsing" overhead with mere
"scanning" overhead -- an approach which seems to compare very favorably
to C/C++ JSON parsers. In addition, it uses space-efficient data
structures for all intermediate data, so it may even pay to build a
semi-index and then use that to parse in one-off situations. (Credit
where credit's due: I think it was Edward Kmett who posted a comment
with this reference on Reddit. I think he mentioned something about
pursuing this for Lens in his Copious Spare Time(TM)?)
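Just to make the idea concrete, here is a deliberately tiny sketch of the
"scan, don't parse" step in Haskell. It handles only a flat JSON object
with string and number values, uses plain Strings rather than anything
efficient, and the names `semiIndex` and `fieldRaw` are made up for this
example -- the actual paper builds a succinct balanced-parentheses index,
not an association list:

```haskell
import Data.Char (isSpace)

-- A value's position in the original input: start offset and length.
type Span = (Int, Int)

-- Scan a flat JSON object (string/number values only, no nesting),
-- recording for each key only WHERE its value sits in the input.
-- No value is interpreted here; that is the "scanning, not parsing" step.
semiIndex :: String -> [(String, Span)]
semiIndex s = go (skip 1)                    -- skip the opening '{'
  where
    n = length s
    skip i | i < n && (isSpace (s !! i) || s !! i == ',') = skip (i + 1)
           | otherwise = i
    go i
      | i >= n || s !! i == '}' = []
      | s !! i == '"' =
          let (key, j) = str (i + 1) ""      -- j is just past the closing '"'
              k = skip j                     -- index of ':'
              v = skip (k + 1)               -- start of the value
              e = valEnd v                   -- one past the value's end
          in (key, (v, e - v)) : go (skip e)
      | otherwise = []
    str i acc
      | s !! i == '"' = (reverse acc, i + 1)
      | otherwise     = str (i + 1) (s !! i : acc)
    valEnd i
      | s !! i == '"' = snd (str (i + 1) "") -- string value: ends at its quote
      | otherwise     = walk i               -- number: runs until ',' or '}'
      where
        walk j | j < n && s !! j `notElem` ",}" = walk (j + 1)
               | otherwise = j

-- Demand a single field: only now do we touch the value's bytes,
-- and even then we return them unparsed, as a raw slice of the input.
fieldRaw :: String -> String -> Maybe String
fieldRaw key s = do
  (off, len) <- lookup key (semiIndex s)
  pure (take len (drop off s))
```

So e.g. `fieldRaw "b" "{\"a\":1,\"b\":\"two\"}"` yields `Just "\"two\""`
without the values of the other fields ever being decoded -- the win the
paper describes is that the index itself is tiny and reusable.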

I'm not sure how this work could be integrated with aeson, but I'm
betting somebody out there has good ideas.

Aeson is already extremely good, but let's make it even better! ... and
by "us", I mean "you, dear Haskell community".

Regards,




More information about the Libraries mailing list