[Haskell] Re: Compilation of big, static tables
atilaromero at yahoo.com.br
Thu Feb 23 11:49:27 EST 2006
Malcolm Wallace wrote:
>Hmm, that only works if the data being stored in the table is of regular
>size? I need variable-length values. Using a two- or three-level
>encoding into strings would start to get /really/ unpleasant.
>And this is pretty ghc-specific stuff.
>Perhaps I should propose a new syntactic construct for the language?
And how about creating a 'reduction rules' syntax? It would map a common
construct that usually compiles to slow code onto predefined fast code. The
optimizations done by the compiler would then become more transparent and
easier to enhance, and users could add their own reduction rules.
In this specific case, the rule could be something like: 'if a function
returns an array of numbers and does not depend on other functions, the
function is a literal array'.
This could be used in IO too, to teach the compiler how to transform
slow code into fast code.
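GHC already has something in this spirit: the RULES pragma lets a library
author declare a rewrite that the optimizer applies to matching expressions.
It rewrites terms rather than detecting "constant functions", so it is only
an analogue of the proposal, not the same thing. A minimal sketch, using the
classic map/map fusion rule (the rule name is arbitrary, and the rule only
fires when compiling with -O):

```haskell
module Main where

-- User-supplied reduction rule: collapse two traversals into one.
-- Semantically a no-op; it only changes the code the compiler emits.
{-# RULES
"map/map fusion" forall f g xs. map f (map g xs) = map (f . g) xs
  #-}

main :: IO ()
main = print (map (+1) (map (*2) [1, 2, 3 :: Int]))  -- prints [3,5,7]
```

The result is the same with or without the rule; RULES are trusted to
preserve meaning, which is also what a 'reduction rules' syntax would have
to guarantee.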
>(Although not necessarily in time for haskell-prime.)
> literalArray -> « literal_0, literal_1, ... , literal_n »
>The type of the literal array would be something like
> « » :: Ix i => Array i t
>Could the array be used at /any/ index type i? Or would it be a fixed
>index type, selected by type signature, inference from the usage
>context, or failing either of those, the monomorphism restriction?
>In any case, the actual bounds would be calculated by the compiler.
>What about the content type t? Should it be a fixed type, or would we
>permit overloading? Come to think of it, why should we permit only
>literals as content elements? Arrays are lazy, so the contents could be
>arbitrary expressions, provided they are all of the same type.
> literalArray -> « exp_0, exp_1, ... , exp_n »
>Would we allow the « » brackets to be used for pattern-matching as well?
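For what it is worth, the point about lazy contents already holds for the
standard Data.Array: listArray over an explicit bounds pair is roughly
today's spelling of the proposed « exp_0, ..., exp_n » literal, and the
elements can be arbitrary expressions of one type. A small sketch (the
table of Fibonacci numbers is just an example of a "big, static table"):

```haskell
import Data.Array (Array, listArray, (!))

-- Elements are arbitrary (lazily evaluated) expressions, not literals;
-- the bounds, unlike in the proposal, must be written out by hand.
table :: Array Int Integer
table = listArray (0, 9) [fib n | n <- [0 .. 9]]
  where
    fib 0 = 0
    fib 1 = 1
    fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = print (table ! 9)  -- prints 34
```

What the « » literal would add is exactly what the compiler cannot do here:
compute the bounds itself and, ideally, emit the fully evaluated table as
static data.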