Image Coding
Noiseless Coding Theorem for Binary Transmission
Given a code with an alphabet of two symbols and a source A with an alphabet of two symbols, the average length of the code words per source symbol can be made as close as desired to the lower bound, the entropy H(A), by encoding sequences of source symbols rather than individual symbols.
The average length L(n) of the code words for encoded n-sequences is bounded by nH(A) ≤ L(n) < nH(A) + 1, so that H(A) ≤ L(n)/n < H(A) + 1/n and L(n)/n approaches H(A) as n grows.
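The bound can be checked numerically. The following Python sketch (an illustration added here, not part of the notes) Huffman-codes blocks of n symbols from an assumed binary memoryless source with P(0) = 0.9 and P(1) = 0.1, and prints L(n)/n, which falls toward H(A) as the block length grows; the helper name huffman_lengths and the chosen probabilities are illustrative assumptions.

import heapq
import itertools
import math

def huffman_lengths(probs):
    """Return the code-word length of each symbol for a binary Huffman code."""
    # Heap items: (probability, tie-breaker, list of symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every symbol in the merged subtree gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

p = {0: 0.9, 1: 0.1}                               # assumed binary source A
H = -sum(q * math.log2(q) for q in p.values())     # entropy H(A) in bits per symbol

for n in (1, 2, 4, 8):
    blocks = list(itertools.product(p.keys(), repeat=n))
    block_probs = [math.prod(p[s] for s in b) for b in blocks]
    lengths = huffman_lengths(block_probs)
    L_n = sum(q * l for q, l in zip(block_probs, lengths))
    print(f"n={n}: L(n)/n = {L_n / n:.4f}   (H(A) = {H:.4f})")

For this source the printed L(n)/n drops from 1 bit per symbol at n = 1 toward H(A) ≈ 0.469 bits per symbol as n increases, which is exactly the behaviour the theorem guarantees.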
The difficulty in estimating the entropy of image sources lies in the fact that pixels are statistically dependent from point to point, line to line, and frame to frame.
Computing the entropy therefore requires considering the symbols in blocks large enough that the statistical dependence between blocks is negligible.
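As a rough illustration of this point, the Python sketch below (an assumption added here, not part of the notes) estimates the per-pixel entropy of a synthetic correlated binary image from the relative frequencies of horizontal n-pixel blocks; the estimate decreases as n grows because larger blocks capture more of the point-to-point dependence. The Markov-style test image and the helper block_entropy_per_pixel are hypothetical.

import numpy as np
from collections import Counter

def block_entropy_per_pixel(img, n):
    """Per-pixel entropy (bits) estimated from non-overlapping horizontal n-pixel blocks."""
    rows, cols = img.shape
    counts = Counter()
    for r in range(rows):
        for c in range(0, cols - n + 1, n):
            counts[tuple(img[r, c:c + n])] += 1
    total = sum(counts.values())
    probs = np.array([v / total for v in counts.values()])
    return float(-(probs * np.log2(probs)).sum()) / n

# Illustrative correlated source: each pixel repeats its left neighbour,
# flipping with probability 0.1 (a first-order Markov chain along each row).
rng = np.random.default_rng(0)
img = np.empty((64, 256), dtype=np.uint8)
img[:, 0] = rng.random(64) < 0.5
for c in range(1, img.shape[1]):
    flip = rng.random(img.shape[0]) < 0.1
    img[:, c] = np.where(flip, 1 - img[:, c - 1], img[:, c - 1])

for n in (1, 2, 4):
    print(f"n={n}: estimated entropy = {block_entropy_per_pixel(img, n):.3f} bits/pixel")

The single-pixel estimate is close to 1 bit per pixel, while the block estimates fall toward the source's true entropy rate (about 0.47 bits per pixel for this flip probability), showing why the blocks must span the range of the statistical dependence.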