Incremental and Decremental SVM for Regression

  • Honorius Gâlmeanu, Siemens Corporate Technology; Faculty of Mathematics and Informatics, Transilvania University of Brasov (honorius.galmeanu@siemens.com)
  • Lucian Mircea Sasu, Faculty of Mathematics and Informatics, Transilvania University of Brasov; Siemens Corporate Technology (lmsasu@unitbv.ro)
  • Razvan Andonie, Computer Science Department, Central Washington University, Ellensburg, USA; Electronics and Computers Department, Transilvania University of Brasov

Abstract

Training a support vector machine (SVM) for regression (function approximation) in an incremental/decremental way consists essentially of migrating input vectors into and out of the support vector set, with specific modifications of the associated thresholds. We introduce, in full detail, such a method, which allows the exact increments or decrements of the thresholds to be determined before vector migrations take place. Two delicate issues receive particular attention: the variation of the regularization parameter (for tuning model performance) and the extreme situations in which the support vector set becomes empty. We experimentally compare our method with several regression methods: the multilayer perceptron, two standard SVM implementations, and two models based on adaptive resonance theory.
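The migration described in the abstract rests on the standard partition of training vectors in epsilon-SVR by their dual coefficients: margin (support) vectors, error vectors at the bound, and remaining vectors inside the tube. The sketch below is not the authors' incremental implementation; it only illustrates, with scikit-learn's batch `SVR` (an assumption of this example, as are the toy data and the `tol` threshold), how a fitted model's vectors fall into those three sets:

```python
import numpy as np
from sklearn.svm import SVR

# Toy 1-D regression problem: noisy sine samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(60, 1))
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, size=60)

C, eps, tol = 1.0, 0.1, 1e-8
model = SVR(kernel="rbf", C=C, epsilon=eps).fit(X, y)

# Expand the dual coefficients (alpha_i - alpha_i*) to all samples;
# non-support vectors keep a coefficient of exactly zero.
coef = np.zeros(len(X))
coef[model.support_] = model.dual_coef_.ravel()

# Partition the training set by the value of the dual coefficient:
#   margin (support) set: 0 < |coef| < C  -- vectors on the epsilon tube
#   error set:            |coef| = C      -- vectors outside the tube
#   remaining set:        coef = 0        -- vectors strictly inside the tube
margin_set = np.flatnonzero((np.abs(coef) > tol) & (np.abs(coef) < C - tol))
error_set = np.flatnonzero(np.abs(coef) >= C - tol)
remaining = np.flatnonzero(np.abs(coef) <= tol)

print(len(margin_set), len(error_set), len(remaining))
```

An incremental step then amounts to moving one vector between these sets while adjusting the dual coefficients of the margin set so that the KKT conditions stay satisfied.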

References

[1] Andonie, R.; Sasu, L. (2006); Fuzzy ARTMAP with input relevances, IEEE Transactions on Neural Networks, 17: 929-941.

[2] Carpenter, G.A.; Grossberg, S. (1988); The ART of Adaptive Pattern Recognition by a Self-Organizing Neural Network, IEEE Computer, 77–88.

[3] Cauwenberghs, G.; Poggio, T. (2000); Incremental and Decremental Support Vector Machine Learning, Neural Information Processing Systems, 409-415.

[4] Chang, C.C.; Lin, C.J. (2001); LIBSVM: a Library for Support Vector Machines, Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm

[5] Diehl, C.P.; Cauwenberghs, G. (2003); SVM Incremental Learning, Adaptation and Optimization, Proceedings of the IJCNN, 4: 2685-2690.

[6] Frank, A.; Asuncion, A. (2010); UCI Machine Learning Repository, http://archive.ics.uci.edu/ml

[7] Galmeanu, H.; Andonie, R. (2008); Incremental/Decremental SVM for Function Approximation, Proceedings of the 11th Intl. Conf. on Optimization of Electrical and Electronic Equipment, 2: 155-160.

[8] Golub, G.H.; Van Loan, C.F. (1996); Matrix Computations, JHU Press, Baltimore and London.

[9] Gu, B.; Wang, J.D.; Yu, Y.; Zheng, G.S.; Yu Fan.; Xu, T. (2012); Accurate on-line v-support vector learning, Neural Networks, 27: 51–59.
http://dx.doi.org/10.1016/j.neunet.2011.10.006

[10] Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I.H. (2009); The WEKA data mining software: an update, SIGKDD Explorations Newsletter, 11(1): 10–18.

[11] Karasuyama, M.; Takeuchi, I. (2010); Multiple incremental decremental learning of support vector machines, IEEE Transactions on Neural Networks, 21(7): 1048–1059.
http://dx.doi.org/10.1109/TNN.2010.2048039

[12] Laskov, P.; Gehl, C.; Krüger, S.; Müller, K.R. (2006); Incremental Support Vector Learning: Analysis, Implementation and Applications, Journal of Machine Learning Research, 7: 1909–1936.

[13] Martin, M. (2002); On-line Support Vector Machines for Function Approximation, Technical Report LSI-02-11-R, Software Department, Universitat Politecnica de Catalunya.

[14] Ma, J.; Theiler, J.; Perkins, S. (2003); Accurate On-line Support Vector Regression, Neural Computation, 15(11): 2683–2703.

[15] Carpenter, G.A.; Grossberg, S.; Markuzon, N.; Reynolds, J.H.; Rosen, D.B. (1992); Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps, IEEE Transactions on Neural Networks, 3: 698-713.
http://dx.doi.org/10.1109/72.159059

[16] Marriott, S.; Harrison, R.F. (1995); A modified fuzzy ARTMAP architecture for the approximation of noisy mappings, Neural Networks, 8(4): 619–641.
http://dx.doi.org/10.1016/0893-6080(94)00110-8

[17] Vigdor, B.; Lerner, B. (2007); The Bayesian ARTMAP, IEEE Transactions on Neural Networks, 18: 1628–1644.
http://dx.doi.org/10.1109/TNN.2007.900234

[18] Sasu, L.M.; Andonie, R. (2013); Bayesian ARTMAP for regression, Neural Networks, ISSN 0893-6080, 46: 23-31.

[19] Shashua, A. (2009); Introduction to Machine Learning: Class Notes 67577, https://arxiv.org/abs/0904.3664v1

[20] Vapnik, V. (1998); Statistical learning theory, New York: Wiley.

[21] Schölkopf, B.; Smola, A.J.; Williamson, R.C.; Bartlett, P.L. (2000); New support vector algorithms, Neural Computation, 12(5): 1207–1245.
http://dx.doi.org/10.1162/089976600300015565
Published
2016-10-17
How to Cite
GÂLMEANU, Honorius; SASU, Lucian Mircea; ANDONIE, Razvan. Incremental and Decremental SVM for Regression. INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL, [S.l.], v. 11, n. 6, p. 755-775, oct. 2016. ISSN 1841-9844. Available at: <http://univagora.ro/jour/index.php/ijccc/article/view/2744>. Date accessed: 16 July 2020. doi: https://doi.org/10.15837/ijccc.2016.6.2744.

Keywords

support vector machine, incremental and decremental learning, regression, function approximation