[Haskell-cafe] Should webassembly be a target for GHC?

Joachim Durchholz jo at durchholz.org
Sat Apr 23 10:08:01 UTC 2016

Am 23.04.2016 um 11:34 schrieb Alexander Kjeldaas:
> On Sat, Apr 23, 2016 at 10:52 AM, Joachim Durchholz <jo at durchholz.org>
> wrote:
>> Finally, any interface libraries that will have to be downloaded from web
>> pages should be small. Less than 100k. Page bloat is a real and serious
>> problem.
> IMO, this particular thing is not a serious problem, and the web pages can
> easily handle two orders of magnitude larger web pages, 10MB javascript
> without problems (example:  Google+).

Well, that's far too much. You have people on UMTS, and people in 
countries with little wired bandwidth; they are all locked out 
if every single page requires a 10 MB download.

BTW the average web page size is about the same as the Doom install 
image: https://mobiforge.com/research-analysis/the-web-is-doom
That's just ridiculous.

> If part of the download is a
> standard library, then we'll see more use of cache forever semantics from
> standardized locations (CDNs).

Cache-forever does not happen in practice. Stuff gets upgraded, stuff 
gets evicted from caches.
Particularly if you're entering at the bottom of the Alexa rankings, you 
can't rely on caching to fix the problem for you.
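(For concreteness, "cache forever" here means serving each library build from a versioned URL with an immutable caching header, roughly like the sketch below — the CDN host, path, and version number are made up for illustration:)

```
# Hypothetical versioned CDN asset. The version baked into the URL is what
# makes indefinite caching safe: a new release gets a new URL, so the old
# cached copy never needs invalidating.
GET https://cdn.example.org/stackage/base-4.9.0.0.js

Cache-Control: public, max-age=31536000, immutable
```

Even with such headers, the cache only helps a user who has already downloaded that exact version from that exact URL; a first visit, an eviction, or a version bump still costs the full download.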

> The stackage eco-system with pre-compiled
> packages can be directly mapped onto a cache forever system and with async
> download of new versions (like what Chrome and other browsers do today),
> the problem is mostly converted from a latency issue into a bandwidth
> issue, and we have plenty of bandwidth.

You have. I have.
Our users do not have.

Also, large libraries tend to be slow - it's not a 1:1 correlation, but 
it is statistically significant.
So if you try to keep code size under control, that's usually a net win.

BTW there's also a latency issue: larger libraries tend to load more 
slowly, particularly if the user's persistent storage is spinning rust 
rather than an SSD. 10 MB costs latency even when it isn't being 
downloaded but merely read from the local cache.

The prevalence of reactions like yours is part of what has made the web 
so slow. Ten years ago, I could open dozens of web pages in my browser 
and things would be snappy; today, with a machine ten times as powerful, 
the browser bogs down with a mere dozen tabs open.
That isn't going to get better, WebAssembly or not.
