
I was under the impression that arithmetic codes are guaranteed to be at least one bit less efficient than Huffman codes per input block. What makes you say they have a better compression ratio?

Are you thinking of pre-defined Huffman tables that aren't adapted to the input? Because a Huffman table adapted to the input ought to be as good as it gets.

(I agree with the other benefits. Since arithmetic coding builds its probability model in a streaming fashion rather than constructing a codebook up front, it is more memory-efficient while working.)
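
To show what I mean by streaming, here's a minimal made-up sketch (not any particular coder's implementation): the "table" is just symbol counts that both encoder and decoder bump as each symbol goes past, so nothing has to be built or stored up front.

    from collections import Counter

    class AdaptiveModel:
        """Toy adaptive model: start flat, update counts per symbol seen."""
        def __init__(self, alphabet_size=256):
            # One pseudo-count per symbol so nothing ever has probability zero.
            self.counts = Counter({s: 1 for s in range(alphabet_size)})
            self.total = alphabet_size

        def prob(self, symbol):
            # Current estimate, used by the coder for the next symbol.
            return self.counts[symbol] / self.total

        def update(self, symbol):
            # Both encoder and decoder apply the same update, so they stay in sync.
            self.counts[symbol] += 1
            self.total += 1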

Huffman codes are conceptually isomorphic to arithmetic codes in which every symbol probability is 2^-k for integer k, so they have an obvious disadvantage: the distribution they can actually model is only a coarse power-of-two approximation of the real one.
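
To make that concrete, here's a toy calculation (the 90/10 two-symbol source is made up purely for illustration): the best any whole-bit code can do is 1 bit per symbol, while the entropy rate, which arithmetic coding approaches, is well under half a bit.

    import math

    # Hypothetical skewed source, chosen only for illustration.
    probs = {"A": 0.9, "B": 0.1}

    # Shannon entropy: the per-symbol rate an ideal arithmetic coder approaches.
    entropy = -sum(p * math.log2(p) for p in probs.values())

    # A Huffman code must give each symbol a whole-bit codeword, so both
    # symbols get 1 bit here -- implicitly modelling the source as {1/2, 1/2}.
    huffman_rate = 1.0

    print(f"entropy      : {entropy:.3f} bits/symbol")  # ~0.469
    print(f"Huffman code : {huffman_rate:.3f} bits/symbol")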

Hopefully k is natural. ;)

Implied, because any symbol distribution whose probabilities do not sum to 1 is invalid anyway ;-)

Huffman codes are less efficient per symbol because each symbol must map to a whole bit string, whereas arithmetic coding effectively smears symbols across bits more finely. Whether you use a dynamic or static probability model is a separate issue that applies to either coding method. (Emotionally, though, I prefer Huffman codes; they're just so neat.)
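
A rough sketch of that smearing, using exact fractions (the three-symbol model and the message are invented for the example); it compares the total bits against what whole-bit Huffman codewords cost for the same static model:

    from fractions import Fraction
    from math import ceil, log2

    # Toy static model, purely illustrative: P(A)=0.7, P(B)=0.2, P(C)=0.1.
    MODEL = {"A": Fraction(7, 10), "B": Fraction(2, 10), "C": Fraction(1, 10)}

    def arithmetic_bits(message, model):
        # Narrow the interval [low, low + width) for each symbol; a point in the
        # final interval can be written in roughly ceil(-log2(width)) + 1 bits.
        # (Only the width matters for the length; low just shows the narrowing.)
        low, width = Fraction(0), Fraction(1)
        for sym in message:
            cum = Fraction(0)
            for s, p in model.items():
                if s == sym:
                    low += cum * width
                    width *= p
                    break
                cum += p
        return ceil(-log2(width)) + 1

    msg = "AAAAAAABBC" * 10  # 100 symbols, matching the model exactly
    huffman_lengths = {"A": 1, "B": 2, "C": 2}  # a Huffman code for this model

    print("arithmetic:", arithmetic_bits(msg, MODEL), "bits")           # ~117
    print("Huffman:   ", sum(huffman_lengths[s] for s in msg), "bits")  # 130

Per symbol, A effectively costs about 0.51 bits and C about 3.3 bits in the arithmetic version, fractional amounts that no whole-bit codeword can express.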


