A Non-flat Function Robust Regression Algorithm Using Multi-kernel LS-SVM

ZHAO Yong-ping, SUN Jian-guo

Citation: ZHAO Yong-ping, SUN Jian-guo. A Non-flat Function Robust Regression Algorithm Using Multi-kernel LS-SVM[J]. INFORMATION AND CONTROL, 2008, 37(2): 160-165.


Funding: Supported by the National Natural Science Foundation of China (50576033)
Author information:

    ZHAO Yong-ping (1982- ), male, Ph.D. candidate. Research interests: control theory and control methods.
    SUN Jian-guo (1939- ), male, professor and doctoral supervisor. Research interests: control, modeling, and fault diagnosis.

  • CLC number: TP274


  • Abstract: The mathematical regression model of the standard least squares support vector machine (LS-SVM) is presented, and a multi-kernel least squares support vector machine (MLS-SVM) algorithm is proposed to enhance the regression accuracy for non-flat functions. A hierarchical clustering method is applied to deal with the lack of sparseness in the MLS-SVM solution. A partial least squares regression (PLSR) method is adopted to realize robust regression with MLS-SVM. A simulation example validates the effectiveness of the proposed method.
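The multi-kernel idea summarized in the abstract can be sketched as follows. This is a minimal Python illustration under stated assumptions, not the authors' exact formulation: the kernel is taken as a weighted sum of RBF kernels at two scales (a common multi-kernel choice for non-flat functions), and the widths, weights, regularization parameter gamma, and test function are all illustrative values invented for the demonstration.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def multi_kernel(X1, X2, sigmas, weights):
    # Weighted sum of RBF kernels at several scales: wide kernels fit the
    # smooth trend, narrow kernels fit sharp local features (non-flatness)
    return sum(w * rbf_kernel(X1, X2, s) for s, w in zip(sigmas, weights))

def lssvm_fit(X, y, sigmas, weights, gamma):
    # Standard LS-SVM dual problem: solve the linear system
    #   [ 0   1^T         ] [b]     [0]
    #   [ 1   K + I/gamma ] [alpha] [y]
    n = len(y)
    K = multi_kernel(X, X, sigmas, weights)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(Xtest, X, b, alpha, sigmas, weights):
    return multi_kernel(Xtest, X, sigmas, weights) @ alpha + b

# A non-flat target: a smooth trend plus a sharp localized bump
X = np.linspace(-3, 3, 120)[:, None]
y = np.sinc(X[:, 0]) + 0.5 * np.exp(-40.0 * (X[:, 0] - 1.5) ** 2)

sigmas, weights = [0.1, 1.0], [0.5, 0.5]   # assumed multi-scale choices
b, alpha = lssvm_fit(X, y, sigmas, weights, gamma=100.0)
yhat = lssvm_predict(X, X, b, alpha, sigmas, weights)
print("max training error:", np.abs(yhat - y).max())
```

The sparseness and robustness steps from the abstract (hierarchical clustering of support vectors, PLSR-based robust re-estimation) would operate on the dual coefficients `alpha` produced above; they are not reproduced here since their exact formulation is given in the paper itself.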
Publication history
  • Received: 2007-01-07
  • Published: 2008-04-19
