Acta Electronica Sinica ›› 2013, Vol. 41 ›› Issue (4): 810-814. DOI: 10.3969/j.issn.0372-2112.2013.04.031

• Research Communication •

A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost

YAO Xu, WANG Xiao-dan, ZHANG Yu-xi, XING Ya-qiong

  1. School of Air and Missile Defense, Air Force Engineering University, Xi'an, Shaanxi 710051, China
  • Received: 2012-05-28 Revised: 2012-10-09 Published: 2013-04-25
    • About the authors:
    • YAO Xu, female, born in October 1982 in Changli, Hebei; Ph.D. candidate. Her research interests include intelligent information processing and machine learning. E-mail: ffxy132@163.com
    • WANG Xiao-dan, female, born in October 1966 in Hanzhong, Shaanxi; professor and doctoral supervisor. Her research interests include pattern recognition, intelligent information processing, and machine learning.
    • Supported by:
    • National Natural Science Foundation of China (No.60975026, No.61273275)

A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost

YAO Xu, WANG Xiao-dan, ZHANG Yu-xi, XING Ya-qiong   

  1. School of Air and Missile Defense, Air Force Engineering University, Xi'an, Shaanxi 710051, China
  • Received:2012-05-28 Revised:2012-10-09 Online:2013-04-25 Published:2013-04-25
    • Supported by:
    • National Natural Science Foundation of China (No.60975026, No.61273275)

Abstract: How to construct base classifiers with both high diversity and high accuracy is the central problem in ensemble learning. To this end, a new ensemble method is proposed: particle swarm optimization (PSO) is used to search for the optimal feature weight distribution that minimizes the classification error rate on the data sets drawn by AdaBoost according to the sample weights; random subspaces are then generated by sampling features according to this optimal weight distribution and applied in the training process of AdaBoost. This increases the diversity among classifiers while preserving the accuracy of the base classifiers. Finally, majority voting is used to fuse the decisions of the base classifiers, and simulation experiments verify the effectiveness of the method.

Keywords: ensemble learning, random subspace, AdaBoost algorithm, particle swarm optimization

Abstract: How to generate base classifiers with high diversity and high accuracy is an open issue in ensemble learning. In this paper, a novel algorithm is proposed to solve this problem: particle swarm optimization is used to search for an optimal feature weight distribution that minimizes, within AdaBoost, the classification error rate on the training data sampled according to the sample-weight distribution. The feature subspace is then constructed according to this optimal feature weight distribution and applied in the training process of AdaBoost. Thus the accuracy of the base classifiers is improved while the diversity between classifiers is increased. Finally, the majority voting method is utilized to fuse the base classifiers' results, and experiments have been done to attest to the validity of the proposed algorithm.
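The pipeline described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: decision stumps stand in for the base learner, the PSO hyperparameters (inertia 0.7, acceleration coefficients 1.4, population and iteration counts) and all function names are assumptions, and the PSO fitness is simplified to the training error of the resulting ensemble.

```python
import numpy as np

def sample_subspace(feature_weights, k, rng):
    # Draw k distinct feature indices with probability proportional to the weights.
    p = feature_weights / feature_weights.sum()
    return rng.choice(len(p), size=k, replace=False, p=p)

class Stump:
    """Decision stump weak learner: a threshold test on a single feature."""
    def fit(self, X, y, w):
        best = (np.inf, 0, 0.0, 1)
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(X[:, j] < thr, -sign, sign)
                    err = w[pred != y].sum()  # weighted training error
                    if err < best[0]:
                        best = (err, j, thr, sign)
        self.err, self.j, self.thr, self.sign = best
        return self

    def predict(self, X):
        return np.where(X[:, self.j] < self.thr, -self.sign, self.sign)

def adaboost_subspace(X, y, feature_weights, T, k, rng):
    # AdaBoost in which each round trains its weak learner on a random
    # feature subspace drawn according to the (PSO-optimized) feature weights.
    n = len(y)
    w = np.full(n, 1.0 / n)
    models = []
    for _ in range(T):
        feats = sample_subspace(feature_weights, k, rng)
        stump = Stump().fit(X[:, feats], y, w)
        err = max(stump.err, 1e-10)
        if err >= 0.5:           # discard learners no better than chance
            continue
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump.predict(X[:, feats])
        w *= np.exp(-alpha * y * pred)   # standard AdaBoost reweighting
        w /= w.sum()
        models.append((alpha, feats, stump))
    return models

def vote(models, X):
    # Fuse the base classifiers' decisions by (weighted) majority vote.
    return np.sign(sum(a * s.predict(X[:, f]) for a, f, s in models))

def pso_feature_weights(X, y, n_particles=6, iters=5, T=5, k=2, seed=0):
    # PSO over feature-weight vectors; a particle's fitness is the training
    # error of the subspace-AdaBoost ensemble built from its weights.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.random((n_particles, d)) + 0.1
    vel = np.zeros((n_particles, d))

    def fitness(wts):
        models = adaboost_subspace(X, y, wts, T, k, rng)
        return np.mean(vote(models, X) != y)

    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        vel = 0.7 * vel + 1.4 * r1 * (pbest - pos) + 1.4 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 1e-3, None)  # keep weights strictly positive
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

# Toy demonstration: the label depends only on feature 0, so the optimized
# feight weights should steer the random subspaces toward that feature.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = np.where(X[:, 0] > 0, 1, -1)
weights = pso_feature_weights(X, y)
ensemble = adaboost_subspace(X, y, weights, T=10, k=2, rng=rng)
accuracy = np.mean(vote(ensemble, X) == y)
```

On this separable toy problem the ensemble should fit the training data almost perfectly once subspaces containing feature 0 are drawn; the sketch omits the paper's experimental setup and uses training error as the PSO fitness purely for brevity.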

Key words: ensemble learning, random subspace, AdaBoost algorithm, particle swarm optimization

CLC number: