[Haskell-cafe] Toy compression algorithms [was: A very edgy language]
Bulat Ziganshin
bulat.ziganshin at gmail.com
Thu Jul 12 03:23:45 EDT 2007
Hello ajb,
Thursday, July 12, 2007, 9:25:35 AM, you wrote:
>> What do you mean by "flat" and "static" applied to PPM? Static PPM models
>> exist - they carry probabilities in a separate table, much like static
>> Huffman encoding. Is "flat" the same as order-0?
> "Static" means that the frequency distribution is fixed. "Flat" means
> that the frequency histogram is flat. (Every code word is predicted to
> occur with the same frequency, resulting in a binary code.)
Well, PPM differs from a simple order-0 coder in that it uses the preceding
symbols to calculate probabilities. LZW differs from an order-0 coder with
flat static probabilities in that it encodes a whole word at each step.
There is nothing in common between PPM and LZW except this basic
order-0 model, so I don't see any reason to call LZW a sort of PPM.
It would be more accurate to say that LZW is an order-0 coder with static
flat probabilities, don't you agree?
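To make the contrast concrete, here is a minimal toy sketch of LZW in Haskell (my own illustration, not from the thread; it assumes 8-bit input characters and ignores real-world details like code-width growth). Each emitted code stands for the longest dictionary "word" matched so far, whereas an order-0 coder would emit one code per input symbol:

```haskell
import qualified Data.Map.Strict as M

-- Toy LZW compressor: the dictionary starts with all single
-- characters (codes 0..255); each step greedily extends the
-- current word while it stays in the dictionary, then emits
-- one code for the whole word and adds word+nextChar.
lzwCompress :: String -> [Int]
lzwCompress = go dict0 256 ""
  where
    dict0 = M.fromList [([c], fromEnum c) | c <- ['\0' .. '\255']]
    -- go dictionary nextFreeCode currentWord remainingInput
    go dict _    w []     = [dict M.! w | not (null w)]
    go dict next w (c:cs)
      | M.member (w ++ [c]) dict = go dict next (w ++ [c]) cs
      | otherwise =
          dict M.! w
            : go (M.insert (w ++ [c]) next dict) (next + 1) [c] cs
```

For example, `lzwCompress "ababab"` yields `[97,98,256,256]`: after "a" and "b" are emitted as single characters, "ab" has entered the dictionary as code 256 and is emitted as one code twice.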
>> Can you give a link? I've never heard of such an algorithm
> http://en.wikipedia.org/wiki/Canonical_Huffman_code
You mentioned an algorithm that efficiently encodes English words using
Huffman codes. Does such an algorithm really exist?
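For what it's worth, the code-assignment step described on that Wikipedia page is easy to sketch: given code lengths for each symbol (however obtained, e.g. from word frequencies), sort symbols by (length, symbol) and assign consecutive integer codes, shifting left whenever the length grows. The word symbols below are hypothetical, just for illustration:

```haskell
import Data.Bits (testBit)
import Data.List (sortOn)

-- Canonical Huffman sketch: derive the actual bit strings from
-- code lengths alone. Input: (symbol, codeLength) pairs.
canonicalCodes :: [(String, Int)] -> [(String, String)]
canonicalCodes syms = zip (map fst sorted) (go 0 (map snd sorted))
  where
    sorted = sortOn (\(s, l) -> (l, s)) syms
    go _ [] = []
    go code (l : ls) =
      toBits l code : case ls of
        []       -> []
        -- next code is (code + 1), left-shifted if the length grows
        (l' : _) -> go ((code + 1) * 2 ^ (l' - l)) ls
    -- big-endian bit string of length l
    toBits l n = [if testBit n i then '1' else '0' | i <- [l - 1, l - 2 .. 0]]
```

For instance, lengths 2,2,3,3 for "the", "of", "and", "to" give `[("of","00"),("the","01"),("and","100"),("to","101")]` - a prefix code that a decoder can reconstruct from the lengths alone, which is the point of the canonical form.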
--
Best regards,
Bulat mailto:Bulat.Ziganshin at gmail.com