
How Suboptimal is the Shannon Code?

Abstract

To determine how suboptimal the Shannon code is, its performance must be compared, in some sense, with that of the optimal code, i.e., the corresponding Huffman code. It is well known that in the worst case the redundancy of both the Shannon and Huffman codes can be arbitrarily close to 1 bit. Beyond this worst-case viewpoint, very little is known. In this paper, we compare the performance of these codes from an average point of view. The redundancy is treated as a random variable on the set of all sources with $n$ symbols, and its average is evaluated. It is shown that the average redundancy of the Shannon code is very close to 0.5 bits, whereas the average redundancy of the Huffman code is less than $n^{-1}(1+\ln n) + 0.086$ bits. It is also proven that the variance of the redundancy of the Shannon code tends to zero as $n$ increases. Therefore, for sources with alphabet size $n$, the redundancy of the Shannon code is approximately 0.5 bits with probability approaching 1 as $n$ tends to infinity.
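As a rough illustration of the quantities compared in the abstract, the following Python sketch estimates the average redundancy of the Shannon code (codeword lengths $\lceil -\log_2 p_i \rceil$) and of a binary Huffman code over sources drawn uniformly at random from the probability simplex. The sampling model and the parameters `n` and `trials` are illustrative assumptions, not the paper's exact setup.

```python
import heapq
import math
import random

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def shannon_redundancy(p):
    """Expected length of the Shannon code (lengths ceil(-log2 p_i)) minus H(p)."""
    avg_len = sum(pi * math.ceil(-math.log2(pi)) for pi in p)
    return avg_len - entropy(p)

def huffman_redundancy(p):
    """Expected length of a binary Huffman code minus H(p)."""
    # Expected codeword length equals the sum of the probabilities of all
    # internal nodes created while merging the two smallest weights.
    heap = list(p)
    heapq.heapify(heap)
    avg_len = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        avg_len += a + b
        heapq.heappush(heap, a + b)
    return avg_len - entropy(p)

def random_source(n):
    """Draw a source with n symbols uniformly from the probability simplex
    (an assumption made for this illustration)."""
    x = [random.expovariate(1.0) for _ in range(n)]
    s = sum(x)
    return [xi / s for xi in x]

if __name__ == "__main__":
    n, trials = 64, 2000
    rs = [shannon_redundancy(random_source(n)) for _ in range(trials)]
    rh = [huffman_redundancy(random_source(n)) for _ in range(trials)]
    print(f"avg Shannon redundancy : {sum(rs) / trials:.4f} bits (paper: ~0.5)")
    print(f"avg Huffman redundancy : {sum(rh) / trials:.4f} bits "
          f"(paper bound: {(1 + math.log(n)) / n + 0.086:.4f})")
```

Under this sampling model, the empirical Shannon-code redundancy should concentrate near 0.5 bits as $n$ grows, consistent with the vanishing-variance claim, while the Huffman-code redundancy stays below the stated bound.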

Journal Papers
Month/Season: January
Year: 2013
