A Classroom Note: Entropy, Information, and Markov Property
The Teaching of Mathematics, IX (2006) no. 1, p. 1
How should the concept of the Markov property be introduced in an
elementary probability theory course? In this author's teaching
experience, the approach that best conveys a natural intuitive
flavor while preserving mathematical rigor is to use the
concepts of entropy and information from
classical Shannon information theory, as suggested in the
brilliant monograph of A. Rényi [5]. Following this path, the
connection between entropy and the Markov property is presented.
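The connection the note describes can be glimpsed numerically. The following is a minimal sketch (not taken from the article itself): for a two-state Markov chain, the Markov property means that the conditional entropy of the next state given the entire past equals H(X_{n+1} | X_n), which is the chain's entropy rate. The transition matrix below is an arbitrary illustrative choice.

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Illustrative two-state transition matrix P[i][j] = P(X_{n+1}=j | X_n=i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution pi solves pi P = pi; for a two-state chain,
# pi_0 = P[1][0] / (P[0][1] + P[1][0]).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1 - pi0]

# Entropy rate: H(X_{n+1} | X_n) = sum_i pi_i * H(row_i).
# By the Markov property, conditioning on the full past adds nothing,
# so this is also H(X_{n+1} | X_n, X_{n-1}, ...).
rate = sum(pi[i] * entropy(P[i]) for i in range(2))

print(entropy(pi))  # entropy of the stationary distribution
print(rate)         # entropy rate, strictly smaller here
```

The entropy rate comes out below the entropy of the stationary distribution, reflecting the information the current state carries about the next one; for an i.i.d. sequence the two would coincide.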
Classification: 97D40, 60J10, 94A15, K65
Keywords: Random variable, Independence, Entropy, Information, Conditional Probability, Sufficient Function, Markov Chain.
@article{TM2_2006_IX_1_a0,
author = {Zoran R. Pop-Stojanovi\'c},
title = {A {Classroom} {Note:} {Entropy,} {Information,} and {Markov} {Property}},
journal = {The Teaching of Mathematics},
pages = {1},
year = {2006},
volume = {IX},
number = {1},
language = {en},
url = {http://geodesic.mathdoc.fr/item/TM2_2006_IX_1_a0/}
}
Zoran R. Pop-Stojanović. A Classroom Note: Entropy, Information, and Markov Property. The Teaching of Mathematics, IX (2006) no. 1, p. 1. http://geodesic.mathdoc.fr/item/TM2_2006_IX_1_a0/