Keywords: feature selection; branch & bound; sequential search; mixture model
@article{KYB_2007_43_5_a8,
author = {Somol, Petr and Novovi\v{c}ov\'a, Jana and Pudil, Pavel},
title = {Notes on the evolution of feature selection methodology},
journal = {Kybernetika},
pages = {713--730},
year = {2007},
volume = {43},
number = {5},
mrnumber = {2376333},
zbl = {1134.62041},
language = {en},
url = {http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a8/}
}
Somol, Petr; Novovičová, Jana; Pudil, Pavel. Notes on the evolution of feature selection methodology. Kybernetika, Volume 43 (2007) no. 5, pp. 713–730. http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a8/
[1] Das S.: Filters, wrappers and a boosting-based hybrid for feature selection. In: Proc. 18th Internat. Conference Machine Learning, 2001, pp. 74–81
[2] Dash M., Choi K., Scheuermann P., Liu H.: Feature selection for clustering – a filter solution. In: Proc. Second Internat. Conference Data Mining, 2002, pp. 115–122
[3] Devijver P. A., Kittler J.: Pattern Recognition: A Statistical Approach. Prentice-Hall, Englewood Cliffs, NJ 1982
[4] Ferri F. J., Pudil P., Hatef M., Kittler J.: Comparative study of techniques for large-scale feature selection. In: Pattern Recognition in Practice IV (E. S. Gelsema and L. N. Kanal, eds.), Elsevier Science B.V., 1994, pp. 403–413
[5] Fukunaga K.: Introduction to Statistical Pattern Recognition. Academic Press, New York 1990
[6] Graham M. W., Miller D. J.: Unsupervised learning of parsimonious mixtures on large spaces with integrated feature and component selection. IEEE Trans. Signal Process. 54 (2006), 4, 1289–1303
[7] Hodr R., Nikl J., Řeháková B., Veselý A., Zvárová J.: Possibilities of a prognostic assessment quoad vitam in low birth weight newborns. Acta Facult. Med. Univ. Brunesis 58 (1977), 345–358
[8] Chen X.: An improved branch and bound algorithm for feature selection. Pattern Recognition Lett. 24 (2003), 12, 1925–1933
[9] Jain A. K., Zongker D.: Feature selection: Evaluation, application and small sample performance. IEEE Trans. Pattern Anal. Mach. Intell. 19 (1997), 2, 153–158
[10] Jain A. K., Duin R. P. W., Mao J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 22 (2000), 2, 4–37
[11] Kohavi R., John G. H.: Wrappers for feature subset selection. Artificial Intelligence 97 (1997), 1–2, 273–324
[12] Kudo M., Sklansky J.: Comparison of algorithms that select features for pattern classifiers. Pattern Recognition 33 (2000), 1, 25–41
[13] Law M. H., Figueiredo M. A. T., Jain A. K.: Simultaneous feature selection and clustering using mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 26 (2004), 1154–1166
[14] Liu H., Yu L.: Toward integrating feature selection algorithms for classification and clustering. IEEE Trans. Knowledge Data Engrg. 17 (2005), 491–502
[15] Mayer H. A., Somol P., Huber R., Pudil P.: Improving statistical measures of feature subsets by conventional and evolutionary approaches. In: Proc. 3rd IAPR Internat. Workshop on Statistical Techniques in Pattern Recognition, Alicante 2000, pp. 77–81
[16] McKenzie P., Alder M.: Initializing the EM Algorithm for Use in Gaussian Mixture Modelling. University of Western Australia, 1994
[17] McLachlan G. J.: Discriminant Analysis and Statistical Pattern Recognition. Wiley, New York 1992
[18] McLachlan G. J., Peel D.: Finite Mixture Models. Wiley, New York 2000
[19] Murphy P. M., Aha D. W.: UCI Repository of Machine Learning Databases [ftp.ics.uci.edu]. University of California, Department of Information and Computer Science, Irvine 1994
[20] Narendra P. M., Fukunaga K.: A branch and bound algorithm for feature subset selection. IEEE Trans. Computers 26 (1977), 917–922
[21] Novovičová J., Pudil P., Kittler J.: Divergence based feature selection for multimodal class densities. IEEE Trans. Pattern Anal. Mach. Intell. 18 (1996), 2, 218–223
[22] Novovičová J., Pudil P.: Feature selection and classification by modified model with latent structure. In: Dealing With Complexity: Neural Network Approach, Springer–Verlag, Berlin 1997, pp. 126–140
[23] Pudil P., Novovičová J., Kittler J.: Floating search methods in feature selection. Pattern Recognition Lett. 15 (1994), 11, 1119–1125
[24] Pudil P., Novovičová J., Kittler J.: Feature selection based on approximation of class densities by finite mixtures of special type. Pattern Recognition 28 (1995), 1389–1398
[25] Pudil P., Novovičová J., Kittler J.: Simultaneous learning of decision rules and important attributes for classification problems in image analysis. Image Vision Computing 12 (1994), 193–198
[26] Sardo L., Kittler J.: Model complexity validation for PDF estimation using Gaussian mixtures. In: Proc. 14th Internat. Conference on Pattern Recognition, Vol. 2, 1998, pp. 195–197
[27] Sebban M., Nock R.: A hybrid filter/wrapper approach of feature selection using information theory. Pattern Recognition 35 (2002), 835–846
[28] Siedlecki W., Sklansky J.: On automatic feature selection. Internat. J. Pattern Recognition Artif. Intell. 2 (1988), 2, 197–220
[29] Somol P., Pudil P., Novovičová J., Paclík P.: Adaptive floating search methods in feature selection. Pattern Recognition Lett. 20 (1999), 11–13, 1157–1163
[30] Somol P., Pudil P.: Oscillating search algorithms for feature selection. In: Proc. 15th IAPR Internat. Conference on Pattern Recognition, 2000, pp. 406–409
[31] Somol P., Pudil P.: Feature Selection Toolbox. Pattern Recognition 35 (2002), 12, 2749–2759
[32] Somol P., Pudil P., Kittler J.: Fast branch & bound algorithms for optimal feature selection. IEEE Trans. Pattern Anal. Mach. Intell. 26 (2004), 7, 900–912
[33] Somol P., Pudil P., Grim J.: On prediction mechanisms in fast branch & bound algorithms. In: Lecture Notes in Computer Science 3138, Springer–Verlag, Berlin 2004, pp. 716–724
[34] Somol P., Novovičová J., Pudil P.: Flexible-hybrid sequential floating search in statistical feature selection. In: Lecture Notes in Computer Science 4109, Springer–Verlag, Berlin 2006, pp. 632–639
[35] Theodoridis S., Koutroumbas K.: Pattern Recognition. Second edition. Academic Press, New York 2003
[36] Wang Z., Yang J., Li G.: An improved branch & bound algorithm in feature selection. In: Lecture Notes in Computer Science 2639, Springer, Berlin 2003, pp. 549–556
[37] Webb A.: Statistical Pattern Recognition. Second edition. Wiley, New York 2002
[38] Yu B., Yuan B.: A more efficient branch and bound algorithm for feature selection. Pattern Recognition 26 (1993), 883–889
[39] Yu L., Liu H.: Feature selection for high-dimensional data: A fast correlation-based filter solution. In: Proc. 20th Internat. Conf. Machine Learning, 2003, pp. 856–863
[40] Benda J., Zvárová J.: Systém programů TIBIS [The TIBIS program system]. Ústav hematologie a krevní transfuze, Praha 1975 (in Czech)
[41] Zvárová J., Perez A., Nikl J., Jiroušek R.: Data reduction in computer-aided medical decision-making. In: MEDINFO 83 (J. H. van Bemmel, M. J. Ball, and O. Wigertz, eds.), North Holland, Amsterdam 1983, pp. 450–453
[42] Zvárová J., Studený M.: Information theoretical approach to constitution and reduction of medical data. Internat. J. Medical Informatics 45 (1997), 1–2, 65–74