Exploiting Coding Redundancy
•Not limited to images (text, other digital info)
•Exploit nonuniform probabilities of symbols
•Entropy as a measure of information content
–$H = -\sum_i \mathrm{Prob}(s_i)\,\log_2 \mathrm{Prob}(s_i)$
–For a source of independent, identically distributed symbols, at least H bits per symbol are needed on average (see the entropy sketch below)
•Idea:
–More frequent symbols get shorter code strings (see the Huffman sketch below)
–Works best when redundancy is high, i.e., entropy is low
•Common algorithms
–Huffman coding
–LZW coding, used in GIF and Unix compress; gzip uses the related LZ77 scheme plus Huffman coding (see the LZW sketch below)
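
A minimal sketch of the entropy measure above, in Python, assuming symbol probabilities are estimated from observed frequencies in a sequence (the function name entropy is illustrative, not from the slides):

import math
from collections import Counter

def entropy(symbols):
    # Shannon entropy H = -sum_i Prob(s_i) * log2(Prob(s_i)),
    # with probabilities estimated from observed symbol counts.
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A highly redundant sequence has low entropy, so it compresses well.
print(entropy("aaaaaaab"))   # ~0.54 bits/symbol (mostly 'a')
print(entropy("abcdefgh"))   # 3.0 bits/symbol (uniform, no redundancy)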
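
A short Huffman-coding sketch of the "frequent symbols get shorter code strings" idea, assuming single-character symbols with frequencies taken from the input itself (huffman_codes is an illustrative helper, not a standard-library function):

import heapq
from collections import Counter

def huffman_codes(symbols):
    # Repeatedly merge the two least frequent nodes; frequent symbols
    # end up near the root of the tree and receive short code strings.
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol source
        return {s: "0" for s in freq}
    # Heap entries: (frequency, tie-breaker, {symbol: code string so far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # least frequent subtree
        f2, _, c2 = heapq.heappop(heap)     # next least frequent subtree
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

print(huffman_codes("aaaabbc"))  # e.g. {'c': '00', 'b': '01', 'a': '1'}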
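
And a bare-bones LZW encoder sketch over bytes with the usual 256-entry initial string table; real GIF/compress implementations add variable code widths and table resets, which are omitted here:

def lzw_encode(data):
    # Build a table of byte strings seen so far and emit table indices,
    # so repeated substrings collapse into single output codes.
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                 # grow the current match
        else:
            out.append(table[w])   # emit code for the longest known prefix
            table[wc] = next_code  # remember the new string
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

print(lzw_encode(b"abababab"))   # [97, 98, 256, 258, 98] -- 5 codes for 8 bytes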