On Estimates for the Entropy of a Language According to Shannon
Teoriâ veroâtnostej i ee primeneniâ, Volume 9 (1964) no. 1, pp. 154–157
This article was harvested from the source Math-Net.Ru
C. E. Shannon proposed upper and lower estimates for the entropy of a language. This paper proves that the lower estimate is attained if and only if, after any combination $b_i^N$ of $N$ letters such that $p(b_i^{N-1})>0$, some letters are equally probable and the remaining letters have probability zero. The upper estimate is attained if and only if, when the letters of the language are arranged in descending order of the probability of their appearance after $b_i^N$, the probability that the $k$-th letter of this sequence appears after $b_i^N$ depends only on $k$ and $N$ and not on $i$ (the arrangement of the letters in the sequence may, however, depend on $i$). The latter result assumes that no letter occurs with probability $1$, which is true of every real language.
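For orientation, the estimates referred to are presumably those from Shannon's prediction experiment ("Prediction and Entropy of Printed English", 1951); the following sketch uses that paper's notation, not anything taken from the present article. With an alphabet of $m$ letters, $F_N$ the conditional entropy of a letter given the preceding letters, and $q_k^N$ the probability that an ideal predictor identifies the correct letter on its $k$-th guess (with $q_{m+1}^N = 0$), Shannon's bounds read
$$\sum_{k=1}^{m} k\,\bigl(q_k^N - q_{k+1}^N\bigr)\log_2 k \;\le\; F_N \;\le\; -\sum_{k=1}^{m} q_k^N \log_2 q_k^N.$$
On this reading, the conditions in the abstract characterize equality: a conditional distribution that is uniform on its support after every admissible $b_i^N$ for the lower bound, and sorted conditional probabilities depending only on the rank $k$ and on $N$, not on the context index $i$, for the upper bound.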
@article{TVP_1964_9_1_a18,
author = {A. P. Sav\v{c}uk},
title = {On {Estimates} for the {Entropy} of {a~Language} {According} to {Shannon}},
journal = {Teori\^a vero\^atnostej i ee primeneni\^a},
pages = {154--157},
year = {1964},
volume = {9},
number = {1},
language = {ru},
url = {http://geodesic.mathdoc.fr/item/TVP_1964_9_1_a18/}
}
A. P. Savčuk. On Estimates for the Entropy of a Language According to Shannon. Teoriâ veroâtnostej i ee primeneniâ, Volume 9 (1964) no. 1, pp. 154–157. http://geodesic.mathdoc.fr/item/TVP_1964_9_1_a18/