GUO Hui, LIU He-ping. A Least Square Support Vector Machine Regression Method Based on Kernel Partial Least Square Feature Extraction[J]. INFORMATION AND CONTROL, 2005, 34(4): 403-406.


A Least Square Support Vector Machine Regression Method Based on Kernel Partial Least Square Feature Extraction


Abstract: We apply kernel partial least squares (KPLS) feature extraction to least squares support vector machine (LSSVM) regression. The original inputs are first mapped into a high-dimensional feature space, where score vectors are computed to reduce the dimensionality of the samples; LSSVM regression is then performed on the reduced features. Experimental results show that LSSVM with KPLS feature extraction performs considerably better than LSSVM without feature extraction, and that KPLS also outperforms linear PLS feature extraction.
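To make the two-stage procedure concrete, the following is a minimal, illustrative sketch (not the authors' code): NIPALS-style KPLS score extraction from a centered kernel matrix, followed by LS-SVM regression on the extracted scores. The RBF kernel width sigma, the regularization constant gamma, the number of retained components, and the toy data are assumptions for illustration only, not values from the paper.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """RBF (Gaussian) kernel matrix between rows of A and rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def kpls_scores(K, y, n_components=3, tol=1e-8, max_iter=500):
    """Extract KPLS score vectors t_1..t_m from a centered kernel matrix K
    and target vector y via the usual NIPALS-style iteration with deflation."""
    n = K.shape[0]
    Kc = K.copy()
    Y = y.reshape(-1, 1).astype(float).copy()
    T = np.zeros((n, n_components))
    for i in range(n_components):
        u = Y[:, [0]]
        for _ in range(max_iter):
            t = Kc @ u
            t /= np.linalg.norm(t)
            c = Y.T @ t
            u_new = Y @ c
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        T[:, [i]] = t
        # deflate the kernel matrix and the targets by the extracted score direction
        P = np.eye(n) - t @ t.T
        Kc = P @ Kc @ P
        Y = Y - t @ (t.T @ Y)
    return T

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM regressor by solving the standard linear system for (b, alpha)."""
    n = X.shape[0]
    Omega = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(Xtrain, alpha, b, Xtest, sigma=1.0):
    return rbf_kernel(Xtest, Xtrain, sigma) @ alpha + b

# Toy usage: reduce 10-dimensional inputs to 3 KPLS scores, then regress.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
K = rbf_kernel(X, X)
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
T = kpls_scores(H @ K @ H, y, n_components=3)
b, alpha = lssvm_fit(T, y)
# For simplicity the demo predicts on the training scores themselves.
print(lssvm_predict(T, alpha, b, T)[:5])
```

In a real comparison against plain LSSVM or linear PLS, test inputs would additionally have to be projected onto the same KPLS score directions before prediction.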

     
