Bayesian Network Classifier for Medical Data Analysis
Abstract. Bayesian networks encode causal relations between variables using probability and graph theory. They can be used both to predict an outcome and to interpret predictions based on the encoded causal relations. In this paper we analyse a tree-like Bayesian network learning algorithm optimised for data classification, and we give solutions for the interpretation and analysis of its predictions. The classification of logical (i.e. binary) data arises specifically in the field of medical diagnosis, where we have to predict the survival chance from different types of medical observations, or we must select the most relevant cause for a given patient record. Surgery survival prediction was examined with the algorithm: the bypass surgery survival chance must be computed for a given patient, using a data set of 66 medical examinations for 313 patients.
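The tree-like structure learning described above is commonly realised in the style of Chow and Liu: weight every pair of variables by their empirical mutual information and keep a maximum-weight spanning tree. The sketch below is an illustration of that idea on binary data, not the exact algorithm of the paper; the function names and the Kruskal-style tree construction are our own assumptions.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two binary columns."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            pxy = np.mean((x == a) & (y == b))
            px, py = np.mean(x == a), np.mean(y == b)
            if pxy > 0:  # zero-probability cells contribute nothing
                mi += pxy * np.log(pxy / (px * py))
    return mi

def chow_liu_tree(data):
    """Maximum-weight spanning tree over the columns of a binary data matrix,
    with edges weighted by pairwise mutual information (Kruskal's algorithm)."""
    n_features = data.shape[1]
    edges = sorted(
        ((mutual_information(data[:, i], data[:, j]), i, j)
         for i, j in combinations(range(n_features), 2)),
        reverse=True)
    parent = list(range(n_features))  # union-find forest

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # adding (i, j) creates no cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

On a patient-record matrix (rows: patients, columns: binary examination outcomes) the returned edge list defines the skeleton of the tree-like network; directing the edges away from a chosen root and attaching conditional probability tables yields the classifier.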
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
ONLINE OPEN ACCESS: Access to the full text of each article and each issue is free of charge under the Attribution-NonCommercial 4.0 International license (CC BY-NC 4.0).
You are free to:
- Share: copy and redistribute the material in any medium or format;
- Adapt: remix, transform, and build upon the material.
The licensor cannot revoke these freedoms as long as you follow the license terms.
DISCLAIMER: The author(s) of each article appearing in International Journal of Computers Communications & Control is/are solely responsible for the content thereof; the publication of an article shall not constitute or be deemed to constitute any representation by the Editors or Agora University Press that the data presented therein are original, correct or sufficient to support the conclusions reached or that the experiment design or methodology is adequate.