Optimal estimator of hypothesis probability for data mining problems with small samples
International Journal of Applied Mathematics and Computer Science, Volume 22 (2012) no. 3, pp. 629-645.


The paper presents a new (to the best of the authors' knowledge) estimator of probability, called the "[...] completeness estimator", along with a theoretical derivation of its optimality. The estimator is especially suitable for a small number of sample items, a feature of many real problems characterized by data insufficiency. The control parameter of the estimator is not assumed in an a priori, subjective way but is determined on the basis of an optimization criterion (least absolute errors). The estimator was compared, with respect to accuracy, with the universally used frequency estimator of probability and with Cestnik's m-estimator. The comparison was carried out both theoretically and experimentally. The results show the superiority of the [...] completeness estimator over the frequency estimator for the probability interval ph ∈ (0.1, 0.9), while the frequency estimator is better for ph ∈ [0, 0.1] and ph ∈ [0.9, 1].
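For orientation, the two baselines the abstract refers to are standard: the frequency estimator n_h/n and Cestnik's m-estimator (n_h + m*p_a)/(n + m), where n_h is the number of sample items confirming the hypothesis, n is the sample size, p_a is a prior probability and m is a strength parameter. The Python sketch below is not taken from the paper (the completeness estimator itself is not reproduced here, and the values of n, m, p_a and the trial count are illustrative assumptions); it only shows one way to compare the mean absolute error of the two baselines on small samples, in the spirit of the least-absolute-error criterion mentioned above.

import random

def frequency_estimate(n_h, n):
    # Classical frequency estimator: share of sample items confirming the hypothesis.
    return n_h / n

def m_estimate(n_h, n, m=2.0, p_a=0.5):
    # Cestnik's m-estimator; m and the prior p_a are illustrative values, not the paper's.
    return (n_h + m * p_a) / (n + m)

def mean_abs_error(estimator, p_true, n=5, trials=20000):
    # Monte Carlo estimate of the mean absolute error |p_hat - p_true| for samples of size n.
    total = 0.0
    for _ in range(trials):
        n_h = sum(random.random() < p_true for _ in range(n))
        total += abs(estimator(n_h, n) - p_true)
    return total / trials

if __name__ == "__main__":
    for p_true in (0.05, 0.3, 0.5, 0.7, 0.95):
        print(p_true,
              round(mean_abs_error(frequency_estimate, p_true), 4),
              round(mean_abs_error(m_estimate, p_true), 4))

Under these assumed settings the frequency estimator tends to do better near the ends of the [0, 1] interval and worse in its interior, which is the same qualitative pattern the abstract reports for its comparison of the frequency and completeness estimators.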
Keywords: single case problem, probability estimation, frequency interpretation of probability, completeness interpretation of probability, uncertainty theory
Keywords (Polish, translated): probability estimation, frequency interpretation of probability, completeness interpretation of probability, uncertainty theory
@article{IJAMCS_2012_22_3_a10,
     author = {Piegat, A. and Landowski, M.},
     title = {Optimal estimator of hypothesis probability for data mining problems with small samples},
     journal = {International Journal of Applied Mathematics and Computer Science},
     pages = {629--645},
     publisher = {mathdoc},
     volume = {22},
     number = {3},
     year = {2012},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/IJAMCS_2012_22_3_a10/}
}
Piegat, A.; Landowski, M. Optimal estimator of hypothesis probability for data mining problems with small samples. International Journal of Applied Mathematics and Computer Science, Volume 22 (2012) no. 3, pp. 629-645. http://geodesic.mathdoc.fr/item/IJAMCS_2012_22_3_a10/

[1] Ben-Haim, Y. (2006). Info-gap Decision Theory, Elsevier, Oxford/Amsterdam.

[2] Burdzy, K. (2009). The Search for Certainty. On the Clash of Science and Philosophy of Probability, World Scientific, Singapore.

[3] Burdzy, K. (2011a). Blog on the book The Search for Certainty. On the Clash of Science and Philosophy of Probability, http://search4certainty.blogspot.com/.

[4] Burdzy, K. (2011b). Philosophy of probability, Website, http://www.math.washington.edu/burdzy/philosophy/.

[5] Carnap, R. (1952). Logical Foundations of Probability, University of Chicago Press, Chicago, IL.

[6] Cestnik, B. (1990). Estimating probabilities: A crucial task in machine learning, in L. Aiello (Ed.), ECAI'90, Pitman, London, pp. 147-149.

[7] Cestnik, B. (1991). Estimating Probabilities in Machine Learning, Ph.D. thesis, Faculty of Computer and Information Science, University of Ljubljana, Ljubljana.

[8] Chernoff, H. (1952). A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations, Annals of Mathematical Statistics 23(4): 493-507.

[9] Cichosz, P. (2000). Learning Systems, Wydawnictwa Naukowo-Techniczne, Warsaw (in Polish).

[10] Cios, K. and Kurgan, L. (2001). SPECT heart data set, UCI Machine Learning Repository, http://archive.ics.uci.edu/ml/datasets/spect+heart.

[11] De Finetti, B. (1975). Theory of Probability: A Critical Introductory Treatment, Wiley, London.

[12] Dubois, D. and Prade, H. (1988). Possibility Theory, Plenum Press, New York, NY/London.

[13] Furnkranz, J. and Flach, P.A. (2005). ROC 'n' rule learning: Towards a better understanding of covering algorithms, Machine Learning 58(1): 39-77.

[14] Hajek, A. (2010). Interpretations of probability, in E.N. Zalta, (Ed.), The Stanford Encyclopedia of Philosophy, http://plato.stanford.edu/entries/probability-interpret/.

[15] Khrennikov, A. (1999). Interpretations of Probability, Brill Academic Pub., Utrecht/ Boston, MA.

[16] Klir, G. J. and Yuan, B. (1996). Fuzzy Sets, Fuzzy Logic, and Fuzzy Systems. Selected Papers by Lotfi Zadeh, World Scientific, Singapore.

[17] Laplace, P. S. (1814, English edition 1951). A Philosophical Essay on Probabilities, Dover Publications, New York, NY.

[18] Larose, D. T. (2010). Discovering Statistics, W.H. Freeman and Company, New York, NY.

[19] Piegat, A. (2011a). Uncertainty of probability, in K. T. Atanassov, M. Baczyński, J. Drewniak, J. Kacprzyk, M. Krawczak, E. Schmidt, M. Wygralak and S. Zadrożny (Eds.) Recent Advances in Fuzzy Sets, Intuitionistic Fuzzy Sets, Generalized Nets and Related Topics, Vol. I: Foundations, IBS PAN, Warsaw, pp. 159-173.

[20] Piegat, A. (2011b). Basic lecture on completeness interpretation of probability, Website, http://kmsiims.wi.zut.edu.pl/pobierz-pliki/cat view/47-publikacje.

[21] Polkowski, L. (2002). Rough Sets, Physica-Verlag, Heidelberg/New York, NY.

[22] Popper, K. R. (1957). The propensity interpretation of the calculus of probability and the quantum theory, in S. Korner (Ed.), Observation and Interpretation: A Symposium of Philosophers and Physicists, Butterworth Scientific Publications, London, pp. 65-70.

[23] Rocchi, P. (2003). The Structural Theory of Probability: New Ideas from Computer Science on the Ancient Problem of Probability Interpretation, Kluwer Academic/Plenum Publishers, New York, NY.

[24] Rokach, L. and Maimon, O. (2008). Data Mining with Decision Trees: Theory and Applications, Machine Perception and Artificial Intelligence, Vol. 69, World Scientific Publishing, Singapore.

[25] Shafer, G. (1976). A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ.

[26] Siegler, R. S. (1976). Three aspects of cognitive development, Cognitive Psychology 8(4): 481-520.

[27] Siegler, R. S. (1994). Balance scale weight distance database, UCI Machine Learning Repository, http://archive.ics.uci.edu/ml/datasets/balance+scale.

[28] Sulzmann, J. N. and Furnkranz, J. (2009). An empirical comparison of probability estimation techniques for probabilistic rules, in J. Gama, J. Santos Costa, A. M. Jorge and P. Brazdil (Eds.), Proceedings of the 12th International Conference on Discovery Science (DS-09), Springer-Verlag, Heidelberg/New York, NY, pp. 317-331.

[29] Sulzmann, J. N. and Furnkranz, J. (2010). Probability estimation and aggregation for rule learning, Technical Report TUDKE-201-03, Knowledge Engineering Group, TU Darmstadt, Darmstadt.

[30] von Mises, R. (1957). Probability, Statistics and Truth, Macmillan, New York, NY.

[31] Witten, I. H. and Frank, E. (2005). Data Mining, Elsevier, Amsterdam.

[32] Zadeh, L. A. (1965). Fuzzy sets, Information and Control 8(3): 338-353.

[33] Ziarko, W. (1999). Decision making with probabilistic decision tables, in N. Zhong (Ed.), New Directions in Rough Sets, Data Mining, and Granular-Soft Computing, Proceedings of the 7th International Workshop, RSFDGrC99, Yamaguchi, Japan, Springer-Verlag, Berlin/Heidelberg, New York, NY, pp. 463-471.