New SV selection strategy and local–global regularisation method for improving online SVM learning


During the online learning process of support vector machines (SVMs), when a newly added sample violates the Karush–Kuhn–Tucker (KKT) conditions, that sample becomes a new SV, and existing samples may migrate between the SV and non-SV sets. Since the performance of an SVM model is determined by its SVs, the model must be updated with the newly added SVs; selecting high-quality candidate SVs therefore improves learning accuracy, whereas low-quality candidate SVs can reduce learning efficiency and trigger unnecessary updates. A new strategy is proposed to select the candidate SVs according to two new criteria: importance and informativeness. Furthermore, a mixed local–global regularisation method is applied during the online learning process to improve the penalty coefficients. Experimental results show that the proposed algorithm achieves faster learning and higher accuracy than traditional methods.
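The KKT-violation test that triggers candidate-SV selection can be sketched as follows. This is a minimal illustration, not the authors' implementation: the importance and informativeness criteria are not specified in the abstract, so only the standard check — a new sample (x, y) with y·f(x) < 1 violates the KKT conditions of the current soft-margin solution and becomes a candidate SV — is shown. The RBF kernel, gamma, and tolerance values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Gaussian (RBF) kernel; gamma=0.5 is an illustrative choice.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def decision_value(x_new, support_vectors, alphas, labels, b, gamma=0.5):
    # f(x) = sum_i alpha_i * y_i * K(x_i, x) + b, summed over current SVs.
    return sum(a * y * rbf_kernel(sv, x_new, gamma)
               for sv, a, y in zip(support_vectors, alphas, labels)) + b

def violates_kkt(x_new, y_new, support_vectors, alphas, labels, b,
                 gamma=0.5, tol=1e-3):
    # A correctly classified sample on or outside the margin satisfies
    # y * f(x) >= 1; anything below 1 - tol violates the KKT conditions
    # and is flagged as a candidate SV for the online update.
    return y_new * decision_value(x_new, support_vectors, alphas,
                                  labels, b, gamma) < 1.0 - tol

# Toy model with a single SV at the origin (alpha=1, y=+1, b=0):
svs, alphas, labels, b = [np.array([0.0, 0.0])], [1.0], [1], 0.0
print(violates_kkt(np.array([5.0, 5.0]), 1, svs, alphas, labels, b))  # far sample: f(x) ~ 0, so it violates
print(violates_kkt(np.array([0.0, 0.0]), 1, svs, alphas, labels, b))  # on the margin: y*f(x) = 1, no violation
```

In a full online learner, a sample passing this test would then be scored by the selection criteria before the model is incrementally updated, so that low-quality candidates do not force an update.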


