Dynamic Selection and Circulating Combination for Multiple Classifier Systems
Abstract: To address the low efficiency of optimal-subset selection and the inflexibility of combination methods in multiple classifier systems, a method of dynamic selection and circulating combination (DSCC) is proposed. Exploiting the complementarity among different classifier models, the method dynamically selects a classifier subset with high recognition accuracy for the target, so that the number of classifiers participating in the ensemble adapts to the complexity of the object to be recognized, and circulating combination is carried out according to classifier confidence. In handwritten digit recognition experiments, the proposed method proves more flexible, efficient, and accurate than other commonly used classifier selection methods.
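The selection-and-circulation idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function name `dscc_predict`, the sum-rule fusion, the confidence-ranked ordering, and the threshold value are all assumptions chosen for illustration. Each classifier is assumed to expose a `predict_proba`-style interface returning class probabilities for one sample.

```python
import numpy as np

def dscc_predict(classifiers, x, conf_threshold=0.9):
    """Illustrative sketch (not the paper's exact algorithm) of dynamic
    selection with circulating combination.

    classifiers: objects with predict_proba(x) -> (n_classes,) array.
    Classifiers are ranked by their individual confidence on this sample,
    then added to the ensemble one at a time (the "circulation") until the
    fused confidence exceeds the threshold, so the ensemble size adapts to
    how hard the sample is.
    Returns (predicted_label, number_of_classifiers_used).
    """
    probs = [np.asarray(clf.predict_proba(x)) for clf in classifiers]
    # Rank classifiers by top-class confidence on this particular sample.
    order = sorted(range(len(probs)), key=lambda i: probs[i].max(), reverse=True)

    combined = np.zeros_like(probs[0])
    for n_used, i in enumerate(order, start=1):
        combined += probs[i]            # sum-rule combination (assumption)
        fused = combined / n_used
        if fused.max() >= conf_threshold:
            # Confident enough: stop circulating and output the decision.
            return int(fused.argmax()), n_used
    # Threshold never reached: fall back to the full ensemble's decision.
    return int(fused.argmax()), len(order)
```

An easy sample is decided by the single most confident classifier, while a hard one pulls in more classifiers before a decision is emitted, matching the adaptive-ensemble-size behaviour the abstract describes.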