[Haskell-cafe] Storing big datasets
Joachim Durchholz
jo at durchholz.org
Sat May 7 21:11:57 UTC 2016
On 07.05.2016 at 16:42, David Turner wrote:
> Thousands of transactions per second is getting into
> needs-clever-optimisation territory if it can't be done in RAM, but it's
> not that tricky. Postgres will batch transactions for you: see for instance
> http://pgeoghegan.blogspot.co.uk/2012/06/towards-14000-write-transactions-on-my.html?m=1
Well, actually it is: that particular author sustains 14,000 transactions
per second, but with benchmarks that may or may not transfer to
Mikhail's scenario, and on a fairly high-end 7,200 RPM HDD which may or
may not fit his definition of standard hardware.
That said, these results are still pretty impressive.
Regards,
Jo
BTW, please "reply to list": replying directly and CC'ing the list
means I have to manually copy/paste the list address to answer.