Acceleration for proximal stochastic dual coordinate ascent algorithm in solving regularised loss minimisation with norm

An accelerated version of the proximal stochastic dual coordinate ascent (SDCA) algorithm for solving regularised loss minimisation with norm is presented, in which a momentum term is introduced while the strong theoretical guarantees of SDCA are retained. The method is also suitable for key machine learning optimisation problems, including the support vector machine (SVM), multiclass SVM, logistic regression, and ridge regression. In particular, Nesterov's estimate sequence technique is adopted to adjust the weight coefficient dynamically and conveniently. The method is applied to training linear SVMs on large training datasets. Experimental results show that the proposed method achieves competitive classification performance and faster convergence than state-of-the-art algorithms.
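The idea of adding momentum to SDCA can be sketched as follows. This is a simplified illustration, not the paper's exact algorithm: it uses ridge regression (where the coordinate maximiser has a closed form) and a fixed momentum weight `beta`, whereas the paper adjusts this weight dynamically via Nesterov's estimate sequence technique. All function and parameter names here are illustrative.

```python
import numpy as np

def accelerated_sdca_ridge(X, y, lam=1e-3, epochs=50, beta=0.3, seed=0):
    """Sketch: SDCA for ridge regression with a simple momentum term.

    Dual variables alpha (one per example) are updated one coordinate
    at a time; the primal iterate is maintained as
    w = X.T @ alpha / (lam * n). The momentum step evaluates the
    residual at an extrapolated point, in the spirit of Nesterov
    acceleration (the fixed `beta` is an assumption of this sketch).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    w_prev = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # extrapolated (look-ahead) point for the momentum step
            v = w + beta * (w - w_prev)
            # closed-form coordinate maximiser for the squared loss
            delta = (y[i] - X[i] @ v - alpha[i]) / (1.0 + X[i] @ X[i] / (lam * n))
            alpha[i] += delta
            w_prev = w
            w = w + delta * X[i] / (lam * n)
    return w

# toy usage: recover a linear model from noisy synthetic data
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.01 * rng.standard_normal(200)
w_hat = accelerated_sdca_ridge(X, y)
```

Because each coordinate update is cheap and the primal iterate is kept in sync with the dual variables, the per-iteration cost matches plain SDCA; the momentum term only adds one vector extrapolation.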

http://iet.metastore.ingenta.com/content/journals/10.1049/el.2017.4544