Incremental and Decremental SVM for Regression
Keywords:
support vector machine, incremental and decremental learning, regression, function approximation
Abstract
Training a support vector machine (SVM) for regression (function approximation) in an incremental/decremental way consists essentially of migrating input vectors into and out of the support vector set while making specific modifications to the associated thresholds. We introduce such a method in full detail; it allows the exact increments or decrements of the thresholds to be determined before the vector migrations take place. Two delicate issues are addressed in particular: the variation of the regularization parameter (used to tune model performance) and the extreme situations in which the support vector set becomes empty. We experimentally compare our method with several regression methods: the multilayer perceptron, two standard SVM implementations, and two models based on adaptive resonance theory.
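To make the idea of vectors migrating in and out of the support vector set concrete, the short Python sketch below retrains a batch epsilon-SVR (scikit-learn's SVR) before and after a new sample is added and reports which training indices enter or leave the support vector set. This is only an illustration of the effect an incremental step must reproduce: the RBF kernel, C=10, epsilon=0.1, and the synthetic sinc data are arbitrary choices, and the sketch retrains from scratch, whereas the method described here obtains the new solution directly by computing the exact threshold increments, without retraining.

```python
# Illustration only (not the paper's algorithm): observe how the support vector
# set of an epsilon-SVR changes when one new sample is added. Here we simply
# retrain a batch SVR before and after the addition; an incremental/decremental
# method updates the solution exactly without this retraining.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(40)  # noisy sinc target

def fit_and_support(X, y, C=10.0, eps=0.1):
    # Fit a batch epsilon-SVR and return the indices of its support vectors.
    model = SVR(kernel="rbf", C=C, epsilon=eps).fit(X, y)
    return set(model.support_.tolist())

# Train on the first 39 samples, then "increment" the training set with the last one.
sv_before = fit_and_support(X[:-1], y[:-1])
sv_after = fit_and_support(X, y)

print("support vectors before:", sorted(sv_before))
print("support vectors after: ", sorted(sv_after))
print("indices that migrated into the set:", sorted(sv_after - sv_before))
print("indices that migrated out of the set:", sorted(sv_before - sv_after))
```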
References
Andonie, R.; Sasu, L. (2006); Fuzzy ARTMAP with input relevances, IEEE Transactions on Neural Networks, 17: 929-941.
Carpenter, G.A.; Grossberg, S. (1988); The ART of Adaptive Pattern Recognition by a Self-Organizing Neural Network, IEEE Computer, 77-88.
Cauwenberghs, G.; Poggio, T. (2000); Incremental and Decremental Support Vector Machine Learning, Neural Information Processing Systems, 409-415.
Chang, C.C.; Lin, C.J. (2001); LIBSVM: a Library for Support Vector Machines, Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
Diehl, C.P.; Cauwenberghs, G. (2003); SVM Incremental Learning, Adaptation and Optimization, Proceedings of the IJCNN, 4: 2685-2690.
Frank, A.; Asuncion, A. (2010); UCI Machine Learning Repository, http://archive.ics.uci.edu/ml
Galmeanu, H.; Andonie, R. (2008); Incremental/Decremental SVM for Function Approximation, Proceedings of the 11th Intl. Conf. on Optimization of Electrical and Electronic Equipment, 2: 155-160.
Golub, G.H.; Van Loan, C.F. (1996); Matrix Computations, JHU Press, Baltimore and London.
Gu, B.; Wang, J.D.; Yu, Y.; Zheng, G.S.; Yu Fan; Xu, T. (2012); Accurate on-line v-support vector learning, Neural Networks, 27: 51-59. http://dx.doi.org/10.1016/j.neunet.2011.10.006
Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I.H. (2009); The WEKA data mining software: an update, SIGKDD Explorations Newsletter, 11(1): 10-18.
Karasuyama, M.; Takeuchi, I. (2010); Multiple incremental decremental learning of support vector machines, IEEE Transactions on Neural Networks, 21(7): 1048-1059. http://dx.doi.org/10.1109/TNN.2010.2048039
Laskov, P.; Gehl, C.; Krüger, S.; Müller, K.R. (2006); Incremental Support Vector Learning: Analysis, Implementation and Applications, Journal of Machine Learning Research, 7: 1909-1936.
Martin, M. (2002); On-line Support Vector Machines for Function Approximation, Technical Report LSI-02-11-R, Software Department, Universitat Politecnica de Catalunya.
Ma, J.; Theiler, J.; Perkins, S. (2003); Accurate On-line Support Vector Regression, Neural Computation, 15(11): 2683-2703.
Carpenter, G.A.; Grossberg, S.; Markuzon, N.; Reynolds, J.H.; Rosen, D.B. (1992); Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps, IEEE Transactions on Neural Networks, 3: 698-713. http://dx.doi.org/10.1109/72.159059
Marriott, S.; Harrison, R.F. (1995); A modified fuzzy ARTMAP architecture for the approximation of noisy mappings, Neural Networks, 8(4): 619-641. http://dx.doi.org/10.1016/0893-6080(94)00110-8
Vigdor, B.; Lerner, B. (2007); The Bayesian ARTMAP, IEEE Transactions on Neural Networks, 18: 1628-1644. http://dx.doi.org/10.1109/TNN.2007.900234
Sasu, L.M.; Andonie, R. (2013); Bayesian ARTMAP for regression, Neural Networks, ISSN 0893-6080, 46: 23-31.
Shashua, A. (2009); Introduction to Machine Learning: Class Notes 67577, https://arxiv.org/abs/0904.3664v1
Vapnik, V. (1998); Statistical Learning Theory, New York: Wiley.
Schölkopf, B.; Smola, A.J.; Williamson, R.C.; Bartlett, P.L. (2000); New support vector algorithms, Neural Computation, 12(5): 1207-1245. http://dx.doi.org/10.1162/089976600300015565