Feature Clustering based MIM for a New Feature Extraction Method

Authors

  • Sabra El Ferchichi, University of Tunis EL Manar, National Engineering School of Tunis, BP 37, Le Belvedere 1002, Tunis, Tunisia
  • Salah Zidi, Lille 1 University, Science and Technology, Cité Scientifique, 59655 Villeneuve d’Ascq Cedex, France
  • Salah Maouche, Lille 1 University, Science and Technology, Cité Scientifique, 59655 Villeneuve d’Ascq Cedex, France
  • Kaouther Laabidi, University of Tunis EL Manar, National Engineering School of Tunis, Tunisia
  • Moufida Ksouri, University of Tunis EL Manar, National Engineering School of Tunis, BP 37, Le Belvedere, Tunisia

Keywords:

feature extraction, Mutual Information Maximization (MIM), similarity measure, clustering

Abstract

In this paper, a new unsupervised feature extraction approach based on a feature clustering algorithm is presented. Using a divisive clustering algorithm, the method searches for a compression of the information contained in the original set of features. It investigates the use of Mutual Information Maximization (MIM) to find an appropriate transformation of the clustered features. Experiments on UCI datasets show that the proposed method often outperforms the conventional unsupervised methods PCA and ICA in terms of classification accuracy.
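The pipeline the abstract describes — group similar features into clusters, then map each cluster to a single extracted feature — can be sketched as follows. This is an illustrative stand-in, not the authors' algorithm: it substitutes a greedy threshold on absolute Pearson correlation for the paper's divisive clustering and similarity measure, and a standardized cluster mean for the MIM-derived transformation. The function names and the threshold value are hypothetical.

```python
import numpy as np

def cluster_features(X, threshold=0.8):
    """Greedily group columns of X whose absolute Pearson correlation
    with a seed feature exceeds `threshold`. (Stand-in for the paper's
    divisive feature clustering; threshold and measure are assumptions.)"""
    n_features = X.shape[1]
    corr = np.abs(np.corrcoef(X, rowvar=False))  # feature-by-feature similarity
    unassigned = set(range(n_features))
    clusters = []
    while unassigned:
        seed = min(unassigned)
        members = [j for j in unassigned if corr[seed, j] >= threshold]
        for j in members:
            unassigned.discard(j)
        clusters.append(members)
    return clusters

def extract_features(X, clusters):
    """Compress each cluster into one extracted feature: here simply the
    mean of the standardized members, a crude surrogate for the
    MIM-optimized transformation studied in the paper."""
    Xs = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    return np.column_stack([Xs[:, c].mean(axis=1) for c in clusters])
```

On data where two features are near-duplicates, the two steps collapse them into one extracted feature, reducing dimensionality while keeping one representative per group of redundant features.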

Author Biography

Sabra El Ferchichi, University of Tunis EL Manar, National Engineering School of Tunis, BP 37, Le Belvedere 1002, Tunis, Tunisia

Department of Mathematics and Computer Science

References

Baker, L.D. and McCallum, A.K.; Distributional clustering of words for text classification, Proc. 21st Ann. Int. ACM SIGIR Conf. on Research and Development in Information Retrieval, 1998. http://dx.doi.org/10.1145/290941.290970

Battiti, R.; Using Mutual Information for Selecting Features in Supervised Neural Net Learning, IEEE Trans. on Neural Networks, 5: 537-550, 1994. http://dx.doi.org/10.1109/72.298224

Blake, C.L. and Merz C.J.; UCI repository of machine learning databases, http://archive.ics.uci.edu/ml/, Department of Information and Computer Science, University of California, Irvine, CA, 1998.

Bonet, I., Saeys, Y., Grau Abalo, R., García, M., Sanchez, R. and Van de Peer, Y.; Feature extraction using clustering of protein, Proc. 11th Iberoamerican Congress in Pattern Recognition (CIARP), Springer, LNCS 4225, 614-623, 2006.

Charbonnier, S. and Gentil, S.; A trend-based alarm system to improve patient monitoring in intensive care units, Control Engineering Practice, 15: 1039-1050, Elsevier, 2007.

Cherkassky, V. and Mulier, F.; Learning from Data: Concepts, Theory and Methods, chapter 5, John Wiley & Sons, 1998.

EL Ferchichi, S., Zidi, S., Laabidi, K., Ksouri, M. and Maouche, S.; A new feature extraction method based on clustering for face recognition, Proc. 12th Engineering Applications of Neural Networks, Springer, IFIP 363, 247-253, 2011.

Fern, X.Z. and Brodley, C.E.; Cluster Ensembles for High Dimensional Clustering: an empirical study, Technical report, CS06-30-02, 2004.

Fisher, J.W., Principe, J.C.; A methodology for information theoretic feature extraction, Proc. 17th Int'l Joint Conf. on Neural Networks, 1998.

Guyon, I., Elisseef, A.; An introduction to variable and feature selection, Journal of Machine Learning Research, 3: 1157-1182, 2003.

Hild II, K.E., Erdogmus, D., Torkkola, K. and Principe, J.C.; Feature extraction using information-theoretic learning, IEEE Trans. on Pattern Analysis and Machine Intelligence, 28, 2006.

Kwak, N., and Choi, C.; Feature extraction based on ICA for binary classification problems, IEEE Trans. on Knowledge and Data Engineering, 15: 1387-1388, 2003.

Kwak, N.; Feature selection and extraction based on mutual information for classification, Ph.D. thesis, Seoul National Univ., Seoul, Korea, 2003.

Payne, T.R. and Edwards, P.; Implicit feature selection with the value difference metric, Proc. 13th European Conf. on Artificial Intelligence, 1998.

Saul, L.K., Weinberger, K.Q., Sha, F., Ham, J. and Lee, D.D.; Spectral Methods for Dimensionality Reduction, in Semi-Supervised Learning, MIT Press, Cambridge, MA, 2006.

Schaffernicht, E. and Kaltenhaeuser, R.; On estimating mutual information for feature selection, Proc. Int'l Conf. on Artificial Neural Networks, Springer, LNCS 6352, 362-367, 2010.

Slonim, N. and Tishby, N.; The power of word clusters for text classification, Proc. 23rd European Colloquium on Information Retrieval Research, 2001.

Suzuki, T., Sugiyama, M. and Kanamori, T.; A least-squares approach to mutual information estimation with application in variable selection, Proc. 3rd Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery (FSDM 2008), JMLR Workshop and Conference Proceedings, 2008.

Torkkola, K. and Campbell, W.M.; Mutual information in learning feature transformations, Proc. 17th Int'l Conf. on Machine Learning, 2000.

Torkkola, K.; Feature extraction by non-parametric mutual information maximization, Journal of Machine Learning Research, 3: 1415-1438, 2003.

Von Luxburg, U., Bubeck, S., Jegelka, S. and Kaufmann, M.; Consistent minimization of clustering objective functions, Neural Information Processing Systems (NIPS), 2007.

Published

2013-09-17
