Maximal Cutting Construction for Multiconlitron
Abstract: The multiconlitron is a general framework for constructing piecewise linear classifiers. For convexly separable and commonly separable data sets, the support conlitron algorithm (SCA) and the support multiconlitron algorithm (SMA), respectively, can separate the two classes correctly. Building on this framework, this paper proposes a maximal cutting construction method for multiconlitron design. The method consists of two training stages. The first stage, the maximal cutting process (MCP), iteratively searches for the linear boundary that cuts off the largest number of samples, thereby constructing as small a set of decision functions as possible, minimizing the number of linear functions in the set, and ultimately simplifying the classification model. The second stage, the boundary adjusting process (BAP), retrains the initial classification boundaries obtained by the MCP and moves them to more appropriate positions to improve the generalization ability of the classifier. Experiments on both synthetic and real data sets show that the proposed method produces a more compact and reasonable classification model with better performance, and comparison with other typical piecewise linear classifiers confirms its effectiveness and competitiveness.
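The greedy flavor of the MCP stage can be illustrated with a minimal sketch: repeatedly place a linear boundary that cuts off as many of the remaining positive samples as possible while leaving all negatives behind, remove the cut samples, and keep the boundary as one decision function. The centroid-based boundary and the names `maximal_cutting`/`predict` below are illustrative assumptions, not the paper's actual boundary-search or BAP procedure.

```python
import numpy as np

def maximal_cutting(pos, neg, max_pieces=20):
    """Greedy sketch of the maximal cutting process (MCP).

    Each iteration places one linear boundary (w, b), keeps it if it
    cleanly cuts off some remaining positive samples from ALL negative
    samples, removes those positives, and repeats.
    """
    remaining = np.asarray(pos, dtype=float)
    neg = np.asarray(neg, dtype=float)
    pieces = []
    for _ in range(max_pieces):
        if len(remaining) == 0:
            break
        # Stand-in boundary: direction from the negative centroid toward
        # the remaining positives, thresholded midway between centroids.
        w = remaining.mean(axis=0) - neg.mean(axis=0)
        b = -w @ (remaining.mean(axis=0) + neg.mean(axis=0)) / 2.0
        cut = remaining @ w + b > 0
        # Accept the boundary only if every negative stays on the other
        # side; otherwise this simple sketch just stops.
        if not cut.any() or np.any(neg @ w + b > 0):
            break
        pieces.append((w, b))
        remaining = remaining[~cut]
    return pieces

def predict(x, pieces):
    # A point is positive if any kept boundary cuts it off: a union of
    # half-spaces, i.e. a degenerate multiconlitron whose conlitrons
    # each consist of a single linear function.
    x = np.asarray(x, dtype=float)
    return any(w @ x + b > 0 for w, b in pieces)
```

On a toy set with two positive clusters flanking a negative cluster, the loop emits one boundary per cluster, which is the intended effect of maximizing the number of samples cut at each step: fewer linear functions, hence a simpler model.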