Systems Engineering and Electronics

• Software, Algorithms and Simulation •

An Incremental Classification Method Based on a Self-Compounding Kernel

冯林1,2, 张晶1,2, 吴振宇2   

  (1. School of Computer Science and Technology, Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, Liaoning 116024, China;
    2. School of Innovation and Entrepreneurship, Dalian University of Technology, Dalian, Liaoning 116024, China)
  • Online:2016-07-22 Published:2010-01-03

Incremental classification based on selfcompounding kernel

FENG Lin1,2, ZHANG Jing1, WU Zhenyu2   

  (1. School of Computer Science and Technology, Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116024, China; 2. School of Innovation Experiment, Dalian University of Technology, Dalian 116024, China)
  • Online:2016-07-22 Published:2010-01-03

Abstract (translated from the Chinese):

When solving real-time classification of dynamic data, the online sequential extreme learning machine (OSELM) model requires no batch computation: it retains only the model trained at the previous time step and adjusts that model with the samples arriving at the current time step. However, this incremental method randomly assigns the hidden-layer neurons in the offline training stage, which weakens the model's robustness, and its solution procedure is difficult to extend to kernel methods, which degrades classification performance. To address these problems, a self-compounding kernel online sequential extreme learning machine (SCK-OSELM) model is proposed. First, a new self-compounding kernel (SCK) method is proposed, which constructs nonlinear combinations of sample features drawn from different kernel spaces; the method can also be applied to other supervised kernel methods. Second, with sparse Bayesian learning as the theoretical basis, the prior distribution of the training data is introduced as the model weights, and hyperparameters are used to adjust the posterior distribution of the weights so that the parameters at the current time step become sparse. Finally, the sparsified parameters are carried into the computation at the next time step. Experiments on real-time classification of dynamic data show that the method is an effective incremental learning algorithm and, compared with OSELM, achieves more stable and accurate classification results.
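The abstract does not spell out the SCK construction in detail. As a rough illustration of mapping samples into different kernel spaces and combining the resulting features nonlinearly, the minimal Python sketch below uses RBF and polynomial base kernels with a weighted sum plus an element-wise product term; the kernel choices, weights, and combination rule are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian kernel matrix between the rows of X and the rows of Y
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def poly_kernel(X, Y, degree=2, coef0=1.0):
    # polynomial kernel matrix
    return (X @ Y.T + coef0) ** degree

def compound_kernel(X, Y, w=(0.5, 0.5)):
    # Hypothetical nonlinear combination of two base kernel spaces:
    # a weighted sum plus an element-wise (Hadamard) product term, which is
    # itself a valid kernel and couples the two feature spaces nonlinearly.
    K_rbf, K_poly = rbf_kernel(X, Y), poly_kernel(X, Y)
    return w[0] * K_rbf + w[1] * (K_rbf * K_poly)

# usage: build the train/train Gram matrix for a kernel classifier
X_train = np.random.default_rng(0).standard_normal((100, 8))
K = compound_kernel(X_train, X_train)
```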

Abstract:

The online sequential extreme learning machine (OSELM) is an incremental classification algorithm: it keeps only the model trained at the previous time step and adjusts that model with the samples arriving at the current time step, so no batch computation is needed when solving real-time dynamic data classification problems. However, the method minimizes only the empirical risk, which leads to overfitting, and it randomly assigns the hidden-layer neurons during offline training, which leaves the model with poor robustness. Moreover, its solution procedure is difficult to extend to kernel methods, which reduces classification accuracy. To address these problems, a new online classification method based on the kernel method, the self-compounding kernel OSELM (SCK-OSELM), is proposed. First, the input samples are mapped into multiple kernel spaces to obtain different features, and nonlinear combinations of these features are computed; the proposed self-compounding kernel (SCK) method can also be applied to other supervised kernel methods. Second, the prior distribution of the training samples is introduced as the model weights to maintain generalization, and hyperparameters are used to drive the posterior distributions of irrelevant weights toward zero, so that sparse parameters are obtained. Finally, the sparse parameters are incorporated into the computation at the next time step. Numerical experiments indicate that the proposed method is effective. Compared with OSELM, it achieves more stable and accurate classification and is well suited to real-time dynamic data classification.
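For readers unfamiliar with the incremental update that SCK-OSELM builds on, the sketch below shows the standard OS-ELM scheme: a random sigmoid hidden layer trained offline on an initial chunk, then a recursive least-squares update for each new chunk. The class and parameter names and the regularization constant are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class OSELM:
    """Minimal OS-ELM sketch: random sigmoid hidden layer + recursive least squares."""

    def __init__(self, n_input, n_hidden, n_output, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_input, n_hidden))   # random input weights (never retrained)
        self.b = rng.standard_normal(n_hidden)               # random hidden biases
        self.beta = np.zeros((n_hidden, n_output))           # output weights, updated online
        self.P = None                                         # running inverse of H^T H

    def _hidden(self, X):
        # sigmoid hidden-layer output matrix H
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def init_fit(self, X0, T0):
        # offline stage: regularized least squares on the initial chunk
        H = self._hidden(X0)
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T0

    def partial_fit(self, X, T):
        # online stage: fold in a new chunk without revisiting earlier data
        H = self._hidden(X)
        S = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P -= self.P @ H.T @ S @ H @ self.P
        self.beta += self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

The recursive step is the standard Sherman-Morrison-Woodbury update of the least-squares solution, which is what lets the model adjust to the current chunk while keeping only the previous model and the matrix P in memory.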