Convexity inequalities for estimating generalized conditional entropies from below
Kybernetika, Volume 48 (2012) no. 2, pp. 242-253. This article was harvested from the Czech Digital Mathematics Library.


Generalized entropic functionals are an active area of research, so lower and upper bounds on these functionals are of interest. Lower bounds for the Rényi conditional $\alpha$-entropy and for two kinds of non-extensive conditional $\alpha$-entropy are obtained. These bounds are expressed in terms of the error probability of the standard decision and extend the inequalities known for the regular conditional entropy. The presented inequalities are based mainly on the convexity of certain functions. In a certain sense, they are complementary to generalized inequalities of Fano type.
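As a rough illustration of the two quantities the abstract relates (this sketch is not code from the paper; the function names and the toy distributions are assumptions for demonstration only), the Rényi $\alpha$-entropy of a distribution and the error probability of the standard (Bayes) decision for a joint distribution can be computed as:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi alpha-entropy: H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha),
    for alpha > 0, alpha != 1 (the limit alpha -> 1 recovers Shannon entropy)."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

def bayes_error(joint):
    """Error probability of the standard (Bayes) decision for a joint
    distribution joint[x][y]: P_e = 1 - sum_y max_x p(x, y)."""
    n_cols = len(joint[0])
    return 1.0 - sum(max(row[y] for row in joint) for y in range(n_cols))

# Uniform distribution on 4 outcomes: H_alpha = ln 4 for every alpha.
print(renyi_entropy([0.25, 0.25, 0.25, 0.25], alpha=2.0))  # ≈ 1.3863 (= ln 4)

# A toy joint distribution p(x, y) on a 2x2 alphabet.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(bayes_error(joint))  # 0.2
```

Lower bounds of the kind studied in the paper constrain how small the conditional $\alpha$-entropy can be once this error probability is fixed.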
Classification: 39B62, 60E15, 62C10, 94E17
Keywords: Rènyi $\alpha $-entropy; non-extensive entropy of degree $\alpha $; error probability; Bayesian problems; functional convexity
@article{KYB_2012_48_2_a4,
     author = {Rastegin, Alexey},
     title = {Convexity inequalities for estimating generalized conditional entropies from below},
     journal = {Kybernetika},
     pages = {242--253},
     year = {2012},
     volume = {48},
     number = {2},
     mrnumber = {2954323},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2012_48_2_a4/}
}
TY  - JOUR
AU  - Rastegin, Alexey
TI  - Convexity inequalities for estimating generalized conditional entropies from below
JO  - Kybernetika
PY  - 2012
SP  - 242
EP  - 253
VL  - 48
IS  - 2
UR  - http://geodesic.mathdoc.fr/item/KYB_2012_48_2_a4/
LA  - en
ID  - KYB_2012_48_2_a4
ER  - 
%0 Journal Article
%A Rastegin, Alexey
%T Convexity inequalities for estimating generalized conditional entropies from below
%J Kybernetika
%D 2012
%P 242-253
%V 48
%N 2
%U http://geodesic.mathdoc.fr/item/KYB_2012_48_2_a4/
%G en
%F KYB_2012_48_2_a4
Rastegin, Alexey. Convexity inequalities for estimating generalized conditional entropies from below. Kybernetika, Volume 48 (2012) no. 2, pp. 242-253. http://geodesic.mathdoc.fr/item/KYB_2012_48_2_a4/

[1] L. Baladová: Minimum of average conditional entropy for given minimum probability of error. Kybernetika 2 (1966), 416-422. | MR | Zbl

[2] T. Cover, J. Thomas: Elements of Information Theory. John Wiley & Sons, New York 1991. | MR | Zbl

[3] I. Csiszár: Axiomatic characterizations of information measures. Entropy 10 (2008), 261-273. | DOI | Zbl

[4] Z. Daróczy: Generalized information functions. Inform. and Control 16 (1970), 36-51. | DOI | MR | Zbl

[5] M. H. DeGroot: Optimal Statistical Decisions. McGraw-Hill, New York 1970. | MR | Zbl

[6] D. Erdogmus, J. C. Principe: Lower and upper bounds for misclassification probability based on Rényi's information. J. VLSI Signal Process. 37 (2004), 305-317. | DOI | Zbl

[7] R. M. Fano: Transmission of Information: A Statistical Theory of Communications. MIT Press and John Wiley & Sons, New York 1961. | MR | Zbl

[8] M. Feder, N. Merhav: Relations between entropy and error probability. IEEE Trans. Inform. Theory 40 (1994), 259-266. | DOI | Zbl

[9] S. Furuichi: Information theoretical properties of Tsallis entropies. J. Math. Phys. 47 (2006), 023302. | DOI | MR | Zbl

[10] M. Gell-Mann, C. Tsallis, eds.: Nonextensive Entropy - Interdisciplinary Applications. Oxford University Press, Oxford 2004. | MR | Zbl

[11] G. H. Hardy, J. E. Littlewood, G. Pólya: Inequalities. Cambridge University Press, London 1934. | Zbl

[12] J. Havrda, F. Charvát: Quantification methods of classification processes: concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30-35. | MR

[13] P. Jizba, T. Arimitsu: The world according to Rényi: thermodynamics of multifractal systems. Ann. Phys. 312 (2004), 17-59. | DOI | MR | Zbl

[14] R. Kamimura: Minimizing $\alpha$-information for generalization and interpretation. Algorithmica 22 (1998), 173-197. | DOI | MR | Zbl

[15] A. Novikov: Optimal sequential procedures with Bayes decision rules. Kybernetika 46 (2010), 754-770. | MR | Zbl

[16] A. Perez: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1-21. | MR | Zbl

[17] A. E. Rastegin: Rényi formulation of the entropic uncertainty principle for POVMs. J. Phys. A: Math. Theor. 43 (2010), 155302. | DOI | MR | Zbl

[18] A. E. Rastegin: Entropic uncertainty relations for extremal unravelings of super-operators. J. Phys. A: Math. Theor. 44 (2011), 095303. | DOI | MR | Zbl

[19] A. E. Rastegin: Continuity estimates on the Tsallis relative entropy. E-print arXiv:1102.5154v2 [math-ph] (2011). | MR

[20] A. E. Rastegin: Fano type quantum inequalities in terms of $q$-entropies. Quantum Information Processing (2011), doi 10.1007/s11128-011-0347-6.

[21] A. Rényi: On measures of entropy and information. In: Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1961, pp. 547-561. | MR | Zbl

[22] A. Rényi: On the amount of missing information in a random variable concerning an event. J. Math. Sci. 1 (1966), 30-33. | MR

[23] A. Rényi: Statistics and information theory. Stud. Sci. Math. Hung. 2 (1967), 249-256. | MR | Zbl

[24] A. Rényi: On some basic problems of statistics from the point of view of information theory. In: Proc. 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1967, pp. 531-543. | MR | Zbl

[25] B. Schumacher: Sending entanglement through noisy quantum channels. Phys. Rev. A 54 (1996), 2614-2628. | DOI

[26] C. Tsallis: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52 (1988), 479-487. | DOI | MR | Zbl

[27] I. Vajda: On the statistical decision problem with discrete parameter space. Kybernetika 3 (1967), 110-126. | MR

[28] I. Vajda: Bounds of the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredachii Informacii 4 (1968), 9-19 (in Russian); translated as Problems of Information Transmission 4 (1968), 6-14. | MR

[29] K. Życzkowski: Rényi extrapolation of Shannon entropy. Open Sys. Inform. Dyn. 10 (2003), 297-310; corrigendum in the e-print version arXiv:quant-ph/0305062v2. | MR | Zbl