Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters

  • Research article
  • Open access
  • Published: 02 March 2020
  • Volume 13, pages 212–222 (2020)

  • Chaoyu Yang1,
  • Jie Yang2 &
  • Jun Ma3 

Abstract

In this paper, we propose an efficient Least Squares Support Vector Machine (LS-SVM) training algorithm that incorporates sparse representation and dictionary learning. First, we formalize LS-SVM training as a sparse representation process. Second, the kernel parameters are adjusted by optimizing their average coherence. The proposed algorithm thus addresses the training problem by generating a sparse solution and optimizing the kernel parameters simultaneously. Experimental results demonstrate that the proposed algorithm achieves performance competitive with state-of-the-art approaches.
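
The sketch below illustrates, under stated assumptions, the two ideas summarized in the abstract: treating the columns of the kernel (Gram) matrix as dictionary atoms so that training reduces to recovering a sparse coefficient vector, and selecting a kernel parameter by minimizing the average mutual coherence of that dictionary. It is a minimal illustration and not the paper's exact algorithm: it assumes an RBF kernel, uses scikit-learn's Orthogonal Matching Pursuit as the sparse solver, and searches a small placeholder grid of kernel widths. The data, the candidate grid, and the helper names (rbf_kernel_matrix, average_coherence, sparse_lssvm_fit) are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit


def rbf_kernel_matrix(X, gamma):
    """Gram matrix K with K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))


def average_coherence(D):
    """Mean absolute inner product between distinct normalized columns of D."""
    Dn = D / np.linalg.norm(D, axis=0, keepdims=True)
    G = np.abs(Dn.T @ Dn)
    mask = ~np.eye(G.shape[0], dtype=bool)
    return G[mask].mean()


def sparse_lssvm_fit(X, y, gamma, n_support=20, reg=1e-3):
    """Sparse approximation of the LS-SVM coefficients: treat the
    (regularized) kernel columns as dictionary atoms and recover a
    sparse coefficient vector with OMP."""
    K = rbf_kernel_matrix(X, gamma) + reg * np.eye(len(X))
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_support)
    omp.fit(K, y)
    return omp.coef_  # most entries are exactly zero


# Placeholder data: a simple binary problem with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sign(X[:, 0] + 0.3 * rng.standard_normal(200))

# Kernel-parameter adaptation: pick the width whose kernel dictionary
# has the lowest average coherence over a small candidate grid.
candidates = [0.01, 0.1, 1.0, 10.0]
gamma = min(candidates, key=lambda g: average_coherence(rbf_kernel_matrix(X, g)))

alpha = sparse_lssvm_fit(X, y, gamma)
print(f"selected gamma = {gamma}, support size = {np.count_nonzero(alpha)}")
```

Note that minimizing coherence alone favors very narrow kernels (a near-diagonal Gram matrix), so in practice the coherence criterion and the sparse solution would need to be coupled and iterated, as the paper's simultaneous formulation suggests, rather than applied as two separate passes as done here.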

Author information

Authors and Affiliations

  1. School of Economics and Management, Anhui University of Science and Technology, Huainan, 232001, China

    Chaoyu Yang

  2. School of Computing and Information Technology, Faculty of Engineering and Information Sciences, University of Wollongong, Wollongong, 2522, NSW, Australia

    Jie Yang

  3. Operations Delivery Division, Sydney Trains, Alexandria, 2015, NSW, Australia

    Jun Ma

Corresponding author

Correspondence to Jie Yang.

Rights and permissions

This is an open access article distributed under the CC BY-NC 4.0 license (https://creativecommons.org/licenses/by-nc/4.0/).

About this article

Cite this article

Yang, C., Yang, J. & Ma, J. Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters. Int J Comput Intell Syst 13, 212–222 (2020). https://doi.org/10.2991/ijcis.d.200205.001

  • Received: 19 November 2019

  • Accepted: 28 January 2020

  • Published: 02 March 2020

  • Version of record: 02 March 2020

  • Issue date: January 2020

  • DOI: https://doi.org/10.2991/ijcis.d.200205.001

Key words

  • Least squares support vector machine
  • Sparse representation
  • Dictionary learning
  • Kernel parameter optimization
