Asymptotically Unbiased Estimator of the Informational Energy with kNN

Authors

  • Angel Caţaron, Electronics and Computers Department, Transylvania University of Braşov, Romania
  • Răzvan Andonie, Computer Science Department, Central Washington University, Ellensburg, USA
  • Yvonne Chueh, Department of Mathematics, Central Washington University, Ellensburg, USA

Keywords:

machine learning, statistical inference, asymptotically unbiased estimator, k-th nearest neighbor, informational energy

Abstract

Motivated by machine learning applications (e.g., classification, function approximation, feature extraction), we introduced in previous work a nonparametric estimator of Onicescu's informational energy. Our method is based on the k-th nearest neighbor distances between the n sample points, where k is a fixed positive integer. In the present contribution, we discuss the mathematical properties of this estimator. We show that it is asymptotically unbiased and consistent, and we provide further experimental results illustrating the convergence of the estimator for standard distributions.
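For readers who want a concrete picture of the approach, the sketch below estimates the informational energy IE(X) = integral of f(x)^2 dx by averaging a standard k-th nearest neighbor density estimate over the sample. The plug-in form f_hat(x_i) = (k - 1) / ((n - 1) * V_d * R_{i,k}^d), the function name knn_informational_energy, and the choice k = 5 are illustrative assumptions made for this sketch; it is not necessarily the exact estimator analyzed in the paper.

    # Hedged sketch (assumed plug-in form, not the authors' exact estimator):
    # estimate IE(X) = integral of f(x)^2 dx by averaging a kNN density
    # estimate f_hat over the n sample points.
    import numpy as np
    from math import gamma, pi

    def knn_informational_energy(x, k=5):
        """Estimate IE(X) from an (n, d) sample array x.

        Assumes f_hat(x_i) = (k - 1) / ((n - 1) * V_d * R_{i,k}^d), where
        R_{i,k} is the distance from x_i to its k-th nearest neighbor and
        V_d is the volume of the d-dimensional unit ball; (k - 1) and
        (n - 1) are the usual bias-correction factors.
        """
        x = np.asarray(x, dtype=float)
        n, d = x.shape
        # Pairwise Euclidean distances; O(n^2) memory, fine for illustration.
        diffs = x[:, None, :] - x[None, :, :]
        dist = np.sqrt((diffs ** 2).sum(axis=-1))
        np.fill_diagonal(dist, np.inf)  # exclude self-distances
        # k-th smallest distance in each row = k-th nearest neighbor distance.
        r_k = np.partition(dist, k - 1, axis=1)[:, k - 1]
        v_d = pi ** (d / 2) / gamma(d / 2 + 1)  # unit-ball volume in R^d
        f_hat = (k - 1) / ((n - 1) * v_d * r_k ** d)
        return f_hat.mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sample = rng.normal(size=(2000, 1))
        # For a standard normal, IE = 1 / (2 * sqrt(pi)) ~ 0.2821.
        print(knn_informational_energy(sample, k=5))

As the paper's abstract suggests, estimates of this type converge toward the true value as n grows, which the standard-normal check above lets one observe empirically.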

References

Andonie, R.; Petrescu, F.; Interacting systems and informational energy, Foundations of Control Engineering, 11:53-59, 1986.

Andonie, R.; Caţaron, A.; An informational energy LVQ approach for feature ranking, Proc. of the European Symposium on Artificial Neural Networks ESANN 2004, Bruges, Belgium, April 28-30, 2004, D-side Publications, 471-476, 2004.

Andonie, R.; How to learn from small training sets, Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano, Switzerland, September, invited talk, 2009.

Bonachela, J.A.; Hinrichsen, H.; Muñoz, M.A.; Entropy estimates of small data sets, J. Phys. A: Math. Theor., 41:202001, 2008. http://dx.doi.org/10.1088/1751-8113/41/20/202001

Caţaron, A.; Andonie, R.; Energy generalized LVQ with relevance factors, Proc. of the IEEE International Joint Conference on Neural Networks IJCNN 2004, Budapest, Hungary, July 26-29, 2004, ISSN 1098-7576, 1421-1426, 2004.

Caţaron, A.; Andonie, R.; Informational energy kernel for LVQ, Proc. of the 15th Int. Conf. on Artificial Neural Networks ICANN 2005, Warsaw, Poland, September 12-14, 2005, W. Duch et al. (Eds.): Lecture Notes in Computer Science 3697, Springer-Verlag Berlin Heidelberg, 601-606, 2005.

Caţaron, A.; Andonie, R.; Energy supervised relevance neural gas for feature ranking, Neural Processing Letters, 1(32):59-73, 2010. http://dx.doi.org/10.1007/s11063-010-9143-z

Caţaron, A.; Andonie, R.; How to infer the informational energy from small datasets, Proc. of the 13th International Conference on Optimization of Electrical and Electronic Equipment (OPTIM 2012), Brasov, Romania, May 24-26, 1065-1070, 2012.

Faivishevsky, L.; Goldberger, J.; ICA based on a smooth estimation of the differential entropy, Proc. of Neural Information Processing Systems NIPS 2008, 2008.

Gamez, J.E.; Modave, F.; Kosheleva, O.; Selecting the most representative sample is NP-hard: Need for expert (fuzzy) knowledge, Proc. of the IEEE World Congress on Computational Intelligence WCCI 2008, Hong Kong, China, June 1-6, 1069-1074, 2008.

Guiasu, S.; Information theory with applications, McGraw Hill, New York, 1977.

Hogg, R.V.; Introduction to mathematical statistics, 6/E, Pearson Education, ISBN 9788177589306, 2006.

Kraskov, A.; Stögbauer, H.; Grassberger, P.; Estimating mutual information, Phys. Rev. E, American Physical Society, 6(69):1-16, 2004.

Kozachenko, L. F.; Leonenko, N. N.; Sample estimate of the entropy of a random vector, Probl. Peredachi Inf., 2(23):9-16, 1987.

Lohr, S.L.; Sampling: Design and analysis, Duxbury Press, 1999.

Miller, I.; Miller, M.; John E. Freund's mathematical statistics with applications, Pearson/Prentice Hall, Upper Saddle River, New Jersey, 2004.

Onicescu, O.; Théorie de l'information. Énergie informationnelle, C. R. Acad. Sci. Paris, Ser. A-B, 263:841-842, 1966.

Paninski, L.; Estimation of entropy and mutual information, Neural Comput., MIT Press, Cambridge, MA, USA, ISSN 0899-7667, 6(15):1191-1253, 2003.

Principe, J.C.; Xu, D.; Fisher, J.W. III; Information-theoretic learning, in Unsupervised adaptive filtering, ed. Simon Haykin, Wiley, New York, 2000.

Silverman, B.W.; Density estimation for statistics and data analysis (Chapman & Hall/CRC Monographs on Statistics & Applied Probability), Chapman and Hall/CRC, 1986. http://dx.doi.org/10.1007/978-1-4899-3324-9

Singh, H.; Misra, N.; Hnizdo, V.; Fedorowicz, A.; Demchuk, E.; Nearest neighbor estimates of entropy, American Journal of Mathematical and Management Sciences, 23:301-321, 2003. http://dx.doi.org/10.1080/01966324.2003.10737616

Walters-Williams, J.; Li, Y.; Estimation of mutual information: A survey, Proc. of the 4th International Conference on Rough Sets and Knowledge Technology, RSKT 2009, Gold Coast, Australia, July 14-16, 2009, Springer-Verlag, Berlin, Heidelberg, 389-396, 2009.

Wang, Q.; Kulkarni, S.R.; Verdu, S.; A nearest-neighbor approach to estimating divergence between continuous random vectors, Proc. of the IEEE International Symposium on Information Theory, ISIT 2006, Seattle, WA, USA, July 9-14, 2006, 242-246, 2006.

Published

2013-09-17
