

Dynamic Selection and Circulating Combination for Multiple Classifier Systems

HAO Hong-Wei, WANG Zhi-Bin, YIN Xu-Cheng, CHEN Zhi-Qiang

Citation: HAO Hong-Wei, WANG Zhi-Bin, YIN Xu-Cheng, CHEN Zhi-Qiang. Dynamic Selection and Circulating Combination for Multiple Classifier Systems. ACTA AUTOMATICA SINICA, 2011, 37(11): 1290-1295. doi: 10.3724/SP.J.1004.2011.01290

Corresponding author: HAO Hong-Wei, Professor at the University of Science and Technology Beijing. His research interests include image processing and pattern recognition. E-mail: hhw@ustb.edu.cn

  • Abstract: To address the inefficiency of optimal-subset selection and the inflexibility of combination methods in the design of multiple classifier systems, this paper proposes a dynamic selection and circulating combination (DSCC) method. Exploiting the complementarity among different classifier models, the method dynamically selects a combination of classifiers with high recognition accuracy for the target, lets the number of combined classifiers adapt to the complexity of the recognition target, and performs circulating combination driven by confidence. In handwritten digit recognition experiments, the proposed method is more flexible and efficient than other commonly used classifier selection methods and achieves higher recognition accuracy.
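The abstract's core idea can be sketched in Python. This is a hypothetical simplification, not the paper's actual algorithm: the per-sample ranking rule (each classifier's own maximum posterior), the probability-averaging fusion, and the confidence threshold are all assumptions introduced for illustration.

```python
# Hypothetical sketch of a confidence-driven "circulating combination":
# classifiers join the ensemble one at a time, in order of their own
# confidence on the sample, until the fused prediction is confident enough.
import numpy as np

def dscc_predict(prob_outputs, threshold=0.9):
    """Predict one sample from a list of per-classifier probability vectors.

    Returns (predicted_class, number_of_classifiers_used). The ensemble
    size adapts per sample: easy samples stop after one classifier,
    harder ones pull in more until `threshold` is met or all are used.
    """
    # Rank classifiers by their individual confidence on this sample.
    order = sorted(range(len(prob_outputs)),
                   key=lambda i: -max(prob_outputs[i]))
    fused = np.zeros_like(prob_outputs[0], dtype=float)
    for k, idx in enumerate(order, start=1):
        fused += prob_outputs[idx]
        avg = fused / k                  # average-of-probabilities fusion
        if avg.max() >= threshold:       # confident enough: stop early
            return int(avg.argmax()), k
    return int(avg.argmax()), len(order)
```

For a sample where the top-ranked classifier is already confident, only one classifier is consulted; lowering its confidence (or raising the threshold) forces more classifiers into the combination, mirroring the adaptive ensemble size described in the abstract.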
Publication History
  • Received: 2010-12-03
  • Revised: 2011-06-13
  • Published: 2011-11-20
