Asymptotically Unbiased Estimation of a Nonsymmetric Dependence Measure Applied to Sensor Data Analytics and Financial Time Series

Angel Cațaron, Razvan Andonie, Yvonne Chueh


A fundamental task in statistical machine learning is the detection of dependencies between unknown random variables from data samples. In previous work, we introduced a nonparametric unilateral dependence measure based on Onicescu's information energy, together with a kNN method for estimating this measure from an available sample set of discrete or continuous variables. This paper provides the formal proofs that the estimator is asymptotically unbiased and has asymptotically zero variance as the sample size increases, which implies that the estimator has good statistical qualities. We investigate the performance of the estimator in applications to sensor data analytics and financial time series.
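To illustrate the plug-in idea behind a kNN estimator of Onicescu's information energy IE(X) = E[f(X)], the sketch below estimates the density at each sample point from the distance to its k-th nearest neighbor and averages these estimates. This is a minimal 1-D illustration of the general technique, not the authors' exact estimator; the function name and the uniform-sample check are illustrative choices.

```python
import random

def ie_knn(sample, k):
    """Plug-in kNN estimate of Onicescu's information energy
    IE(X) = E[f(X)] for a 1-D continuous sample (illustrative sketch)."""
    n = len(sample)
    total = 0.0
    for i, x in enumerate(sample):
        # Sorted distances from x to every other sample point.
        dists = sorted(abs(x - y) for j, y in enumerate(sample) if j != i)
        r_k = dists[k - 1]            # distance to the k-th nearest neighbor
        # 1-D kNN density estimate: k points fall within radius r_k,
        # i.e. an interval of length 2 * r_k.
        f_hat = k / (n * 2.0 * r_k)
        total += f_hat
    return total / n                  # average the density estimates

# Sanity check: for X ~ Uniform(0, 1), IE(X) = integral of f(x)^2 dx = 1,
# so the estimate should be close to 1 for a reasonably large sample.
random.seed(0)
sample = [random.random() for _ in range(2000)]
estimate = ie_knn(sample, k=10)
```

A higher information energy corresponds to a more concentrated distribution; the asymptotic analysis in the paper concerns the bias and variance of estimators of this kind as the sample size grows.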


machine learning, sensor data analytics, financial time series, statistical inference, information energy, nonsymmetric dependence measure, big data analytics




Andonie R., Cațaron A. (2004), An informational energy LVQ approach for feature ranking, Proceedings of the European Symposium on Artificial Neural Networks (ESANN 2004), d-side publications, 471–476, 2004.

Andonie R. (1986), Interacting systems and informational energy, Foundations of Control Engineering, 11, 53–59, 1986.

Bonachela J.A., Hinrichsen H., Munoz M.A. (2008), Entropy estimates of small data sets, J. Phys. A: Math. Theor., 41(20), 1–20, 2008.

Cațaron A., Andonie R., Chueh Y. (2013), Asymptotically unbiased estimator of the informational energy with kNN, International Journal of Computers Communications & Control, 8(5), 689–698, 2013.

Cațaron A., Andonie R. (2012), How to infer the informational energy from small datasets, Optimization of Electrical and Electronic Equipment (OPTIM), 2012 13th International Conference on, 1065 –1070, 2012.

Cațaron A., Andonie R., Chueh Y. (2014), kNN estimation of the unilateral dependency measure between random variables, 2014 IEEE Symposium on Computational Intelligence and Data Mining, (CIDM 2014), Orlando, FL, USA, 471–478, 2014.

Cațaron A., Andonie R., Chueh Y. (2015), Financial data analysis using the informational energy unilateral dependency measure, Proceedings of the International Joint Conference on Neural Networks, (IJCNN 2015), Killarney, Ireland, 1-8, 2015.

Chueh Y., Cațaron A., Andonie R. (2016), Mortality rate modeling of joint lives and survivor insurance contracts tested by a novel unilateral dependence measure, 2016 IEEE Symposium Series on Computational Intelligence, SSCI 2016, Athens, Greece, December 6-9, 2016, 1–8, 2016.

Faivishevsky L., Goldberger J. (2008), ICA based on a smooth estimation of the differential entropy, NIPS, 1-8, 2008.

Gamez J.E., Modave F., Kosheleva O. (2008), Selecting the most representative sample is NP-hard: Need for expert (fuzzy) knowledge, Fuzzy Systems, 2008. FUZZ-IEEE 2008. (IEEE World Congress on Computational Intelligence). IEEE International Conference on, 1069–1074, 2008.

Guiasu S. (1977), Information theory with applications, McGraw Hill, New York, 1977.

Hogg R.V., McKean J.W., Craig A.T. (2006), Introduction to Mathematical Statistics, 6th edition, Pearson Education, 2006.

Kozachenko L. F., Leonenko N. N. (1987), Sample estimate of the entropy of a random vector, Probl. Peredachi Inf., 23(2), 9–16, 1987.

Kraskov A., Stögbauer H., Grassberger P. (2004), Estimating mutual information, Phys. Rev. E, 69, 1–16, 2004.

Li H. (2015), On nonsymmetric nonparametric measures of dependence, arXiv:1502.03850, 2015.

Lohr H. (1999), Sampling: Design and Analysis, Duxbury Press, 1999.

Miller I., Miller M. (2003), John E. Freund's Mathematical Statistics with Applications, 7th edition, Pearson/Prentice Hall, Upper Saddle River, New Jersey, 2003.

Onicescu O. (1966), Théorie de l'information. Énergie informationnelle, C. R. Acad. Sci. Paris, Ser. A–B, 263, 841–842, 1966.

Paninski L. (2003), Estimation of entropy and mutual information, Neural Comput., 15, 1191–1253, 2003.

Schweizer B., Wolff E.F. (1981), On nonparametric measures of dependence for random variables, Ann. Statist., 9(4), 879–885, 1981.

Silverman B.W. (1986), Density Estimation for Statistics and Data Analysis (Chapman & Hall/CRC Monographs on Statistics & Applied Probability), Chapman and Hall/CRC, 1986.

Singh H., Misra N., Hnizdo V., Fedorowicz A., Demchuk E. (2003), Nearest neighbor estimates of entropy, American Journal of Mathematical and Management Sciences, 23, 301–321, 2003.

Walters-Williams J., Li Y. (2009), Estimation of mutual information: A survey, Proceedings of the 4th International Conference on Rough Sets and Knowledge Technology, Springer-Verlag, Berlin, Heidelberg, 389–396, 2009.

Wang Q., Kulkarni S. R., Verdu S. (2006), A nearest-neighbor approach to estimating divergence between continuous random vectors, Proc. of the IEEE International Symposium on Information Theory, Seattle, WA, 242-246, 2006.


Copyright (c) 2017 Angel Cațaron, Razvan Andonie, Yvonne Chueh

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.


INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL (IJCCC), With Emphasis on the Integration of Three Technologies (C & C & C),  ISSN 1841-9836.
