A note on how Rényi entropy can create a spectrum of probabilistic merging operators
Kybernetika, Volume 55 (2019) no. 4, pp. 605-617
This article was harvested from the Czech Digital Mathematics Library.


In this paper we present a result that relates the merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, drawing inspiration from Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a single closed convex set of probability functions, the discussion on how to represent several such sets is still ongoing. The presented result offers a perspective on this discussion. Furthermore, for those who prefer the standard minimisation based on the squared Euclidean distance, it provides a connection to a probabilistic merging operator based on the Kullback-Leibler divergence, which is itself closely connected to Shannon entropy.
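To make the abstract's setting concrete, the sketch below treats the degenerate case in which each source's closed convex set is a single probability function; the paper's operators act on whole closed convex sets, so this illustrates the two distances involved, not the paper's construction. The closed forms used (arithmetic mean as the minimiser of the total squared Euclidean distance, normalised geometric mean as the minimiser of the total Kullback-Leibler divergence taken in the direction KL(p || q_i)) are standard facts, and the helper names merge_euclidean, merge_kl and renyi_entropy are hypothetical, introduced here only for illustration.

import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha).
    # The limit alpha -> 1 recovers Shannon entropy, while alpha = 2 gives
    # -log(sum_i p_i^2), whose argument is the squared Euclidean norm of p.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)))  # Shannon entropy as the limit
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def merge_euclidean(sources):
    # Minimising sum_i ||p - q_i||^2 over the simplex yields the arithmetic
    # mean of the q_i, which is itself a probability vector.
    return np.mean(np.asarray(sources, dtype=float), axis=0)

def merge_kl(sources):
    # Minimising sum_i KL(p || q_i) over p yields the normalised geometric
    # mean of the q_i (all q_i assumed strictly positive).
    q = np.asarray(sources, dtype=float)
    g = np.exp(np.mean(np.log(q), axis=0))
    return g / g.sum()

q1 = [0.7, 0.2, 0.1]
q2 = [0.3, 0.4, 0.3]
print(merge_euclidean([q1, q2]))  # [0.5 0.3 0.2]
print(merge_kl([q1, q2]))         # ~[0.501 0.309 0.190]
print(renyi_entropy(q1, 2.0))     # collision entropy of q1
print(renyi_entropy(q1, 1.0))     # Shannon entropy of q1

In the general setting of the paper, the corresponding objectives would be minimised jointly over points drawn from each closed convex set, e.g. with a numerical optimiser; that is beyond this sketch.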
DOI: 10.14736/kyb-2019-4-0605
Classification: 52A99, 52C99
Keywords: probabilistic merging; information geometry; Kullback–Leibler divergence; Rényi entropy
@article{10_14736_kyb_2019_4_0605,
     author = {Adam\v{c}{\'\i}k, Martin},
     title = {A note on how {R\'enyi} entropy can create a spectrum of probabilistic merging operators},
     journal = {Kybernetika},
     pages = {605--617},
     year = {2019},
     volume = {55},
     number = {4},
     doi = {10.14736/kyb-2019-4-0605},
     mrnumber = {4043538},
     zbl = {07177906},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2019-4-0605/}
}
TY  - JOUR
AU  - Adamčík, Martin
TI  - A note on how Rényi entropy can create a spectrum of probabilistic merging operators
JO  - Kybernetika
PY  - 2019
SP  - 605
EP  - 617
VL  - 55
IS  - 4
UR  - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2019-4-0605/
DO  - 10.14736/kyb-2019-4-0605
LA  - en
ID  - 10_14736_kyb_2019_4_0605
ER  - 
%0 Journal Article
%A Adamčík, Martin
%T A note on how Rényi entropy can create a spectrum of probabilistic merging operators
%J Kybernetika
%D 2019
%P 605-617
%V 55
%N 4
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2019-4-0605/
%R 10.14736/kyb-2019-4-0605
%G en
%F 10_14736_kyb_2019_4_0605
Adamčík, Martin. A note on how Rényi entropy can create a spectrum of probabilistic merging operators. Kybernetika, Tome 55 (2019) no. 4, pp. 605-617. doi: 10.14736/kyb-2019-4-0605
