[Haskell-cafe] Re: Data constructors versus types
Peter Verswyvelen
bf3 at telenet.be
Thu Jan 17 02:32:58 EST 2008
> ghc -ddump-simpl and ensure that your values get unboxed...
I was not really talking about boxed/unboxed values; that's another
issue, I think.
What I was talking about is more related to the work of Neil Mitchell
and Colin Runciman in their static checker for pattern matching
http://www-users.cs.york.ac.uk/~ndm/downloads/paper-unfailing_haskell_a_static_checker_for_pattern_matching-24_sep_2005.pdf
For example, suppose we had a language that only knows "bits" as a
datatype, together with data constructors (tagging the bits):
data Number = Int   Bits#32
            | Float Bits#32

(Int   x) + (Int   y) = Int   (primAddInt32   x y)
(Float x) + (Float y) = Float (primAddFloat32 x y)
etc.
(1) Would a sufficiently smart compiler be able to eliminate the tagging
overhead at runtime? Answer: yes?
(2) Would such a language be just as statically safe as Haskell? Answer:
I guess so?
(3) What would the complexity of such a compiler be, given a code size
of n? Answer: O(???)
(4) Would it be possible to create an incremental compiler that reduces
the complexity from, say, quadratic to linear or constant? On very old
computers I worked with great incremental C assemblers and linkers,
where the time needed to recompile/relink was mostly proportional to the
amount of change one made. I guess current compilers are so complex that
making them incremental would be insane?
Thank you,
Peter
>
> > You're talking about O(big)... But wasn't the C language in some way
> > successful because, on the hardware at that time, other much nicer
> > languages (e.g. LISP) were just way too slow? Or was this just O(n)
> > times slower?
> >
> Compiler technology also wasn't as advanced as it is now; Stalin
> can't compile even small programs in under, say, 5 minutes... Compare
> this to e.g. TurboPascal, which AFAIR used three passes (parsing,
> error reporting, code generation); it was similar with C compilers
> back then.
>
> Lisp was fast on Lisp machines, where it is to the hardware what C is
> to von Neumann architectures: an assembler.
>
> I'm not at all sure about the specific O's involved, but I guess it's
> quite easy to get to NP-complete if you want to do a lot without much
> information.
>
>
> > IMHO: Shouldn't concepts that are conceptually the same (in this
> > case, "giving meaning/adding constraints to bits of data") at
> > runtime and compile time look very similar in the language? Most
> > languages require completely different syntax and code when you want
> > something to be lazy versus strict. Haskell doesn't: you can just
> > add an annotation if you want it to be strict; not much rewriting is
> > required. However, if I want to change a runtime data constructor
> > definition (and code) into a compile-time type, then I basically
> > have to rewrite all of my code. That is not the case in Scheme, as
> > far as I understand it.
> >
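To illustrate the annotation point in the quoted paragraph: a minimal
sketch, assuming GHC's BangPatterns extension, of how switching between
lazy and strict is an annotation rather than a rewrite:

{-# LANGUAGE BangPatterns #-}

data LazyPair   = LazyPair   Int Int    -- fields evaluated on demand
data StrictPair = StrictPair !Int !Int  -- bangs force the fields

sumTo :: Int -> Int -> Int
sumTo !acc 0 = acc                      -- the bang keeps the accumulator
sumTo !acc n = sumTo (acc + n) (n - 1)  -- evaluated at every step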
> Scheme knows no types but the built-in ones like INT, PAIR, LIST,
> SYMBOL, and so on. Even if you distinguish, say,
>
> ('complex 1 2)
> from
> ('vec3 1 2 3)
>
> , the compiler in general won't stop you from passing these things to
> the wrong functions. It doesn't even know that a function is passed a
> "LIST (QUOTEDSYMBOL INT INT)" or a "LIST (QUOTEDSYMBOL INT INT INT)";
> it just sees a pair in both cases.
>
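For contrast, a minimal Haskell sketch of the same two values: distinct
constructors give distinct types, so the compiler rejects the mix-up
that Scheme silently accepts:

data Complex = Complex Int Int
data Vec3    = Vec3 Int Int Int

magnitude2 :: Complex -> Int
magnitude2 (Complex re im) = re * re + im * im

-- magnitude2 (Vec3 1 2 3)   -- rejected at compile time: type error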
> Lisp is actually not really meant to be compiled, but interpreted.
> The nice thing is that it doesn't need more than a handful of
> primitives, a list parser, and a heap manager/garbage collector and
> evaluator, all of which can be implemented in under 1000 lines of C.
> Things get more involved with call/cc, but then how many C
> programmers have ever heard of setjmp...
>
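To make "cheap" concrete, here is a minimal sketch of such an
evaluator's core, in Haskell rather than C, covering only quote,
lambda, and application; the parser, the remaining primitives, and
error handling are omitted:

data Expr = Sym String | Num Int | List [Expr]
          | Closure [String] Expr Env  -- a lambda paired with its environment

type Env = [(String, Expr)]

eval :: Env -> Expr -> Expr
eval env (Sym s) = maybe (error ("unbound: " ++ s)) id (lookup s env)
eval _   (List [Sym "quote", e])              = e
eval env (List [Sym "lambda", List ps, body]) = Closure [s | Sym s <- ps] body env
eval env (List (f : args))                    = apply (eval env f) (map (eval env) args)
eval _   e                                    = e  -- numbers and closures are self-evaluating

apply :: Expr -> [Expr] -> Expr
apply (Closure ps body env) args = eval (zip ps args ++ env) body
apply _                     _    = error "not a function"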
> Off the top of my head, one set of possible primitives is
>
> quote, lambda, set!, succ, pred, cond.
>
> You can then start by defining define itself:
>
> (set! 'define (lambda (sym f) (set! sym f)))
>
> There's the wizard book (SICP) and Graham's ANSI Common Lisp if
> you're interested in how cheap Lisp actually is.
>