[Haskell-cafe] Re: How much of Haskell was possible 20 years ago?

Brandon S. Allbery KF8NH allbery at ece.cmu.edu
Sun Oct 21 22:02:25 EDT 2007


On Oct 21, 2007, at 21:31, Maurício wrote:

> Anyway, what I would like would be a "theoretical"
> answer. Is there something fundamentally different
> between a C compiler and a Haskell one that makes
> the former fit into 30Kb but not the other? If

I am not sure *modern* C would have fit into 30KB.  Back then, C was
the original K&R C, which was in many ways much lower level than ANSI
C; those compilers could be quite compact.  (I remember playing with
SmallC, which compiled a subset of C covering 95% of the features
programmers actually used back then, and which was extremely compact.)

There are two reasons ANSI C would be unlikely to fit:  modern C
compilers do optimization, which (aside from very simplistic
"peephole" optimization) needs lots of memory; and the ANSI C type
system (such as it is) is complex enough to require more memory to  
deal with properly.  In early K&R C (which was still relatively close  
to its untyped predecessor BCPL), many things were "untyped" and  
effectively (int) (a machine word); non-(int) integral types were  
promoted to (int), and (float) was promoted to (double), wherever  
possible to simplify things for the compiler.  You could use more  
complex types to some extent, but they tended to perform poorly  
enough that much effort went into avoiding them.  Even in later K&R C  
(which still did type promotion but deprecated "untyped" functions  
and removed "untyped" variables), people avoided using even (long  
int) except through pointers because they were so expensive to use as  
function arguments and return values (and some compilers, mostly on  
some mainframe architectures, *couldn't* return (long int) safely).

Both of these also apply to Haskell.  In particular, you can do a  
naive compilation of Haskell code but it will perform very poorly ---  
Haskell *needs* a good optimizer.  (Compare the performance of code  
compiled with unregisterised GHC to that compiled with normal, highly  
optimized GHC sometime.)  Additionally, even Haskell 98 can need quite
a bit of memory to do type unification if you don't explicitly  
specify every type everywhere; and this gets much worse with the  
extensions in modern Haskell (in particular, functional dependencies  
complicate type unification).
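
To make that concrete, here is a minimal sketch of both effects.
The names are mine, purely illustrative; the second half assumes
GHC's MultiParamTypeClasses and FunctionalDependencies extensions:

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 FlexibleInstances #-}

    -- Haskell 98 part: with no signatures, each composition doubles
    -- the size of the inferred type, so the unifier has to build and
    -- carry ever-larger types in memory.
    dup :: a -> (a, a)
    dup x = (x, x)

    quad = dup . dup    -- inferred: a -> ((a, a), (a, a))
    oct  = quad . quad  -- result type now holds 16 copies of 'a'

    -- Extension part: the dependency "c -> e" obliges the checker
    -- to "improve" types across constraints, not merely unify them.
    class Collects c e | c -> e where
      cEmpty  :: c
      cInsert :: e -> c -> c

    instance Collects [a] a where
      cEmpty  = []
      cInsert = (:)

    -- Because c determines e, both inserts must share one element
    -- type; GHC infers  addTwo :: Collects c e => e -> e -> c -> c
    addTwo x y coll = cInsert x (cInsert y coll)

GHC infers all of this happily, but the fundep case shows why the
constraint solver needs more machinery (and memory) than plain
Hindley-Milner unification.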

There may be theory issues which also impact the question, but in  
large part it's not theory so much as practical concerns.

-- 
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery at kf8nh.com
system administrator [openafs,heimdal,too many hats] allbery at ece.cmu.edu
electrical and computer engineering, carnegie mellon university    KF8NH



