Maximal Cutting Construction for Multiconlitron

LENG Qiang-Kui  LI Yu-Jian

Citation: LENG Qiang-Kui, LI Yu-Jian. Maximal Cutting Construction for Multiconlitron. ACTA AUTOMATICA SINICA, 2014, 40(4): 721-730. doi: 10.3724/SP.J.1004.2014.00721

doi: 10.3724/SP.J.1004.2014.00721
Funds:

Supported by National Natural Science Foundation of China (61175004), Natural Science Foundation of Beijing (4112009), Beijing Municipal Education Commission Science and Technology Development Plan Key Project (KZ201210005007), and Specialized Research Fund for the Doctoral Program of Higher Education (20121103110029)

Details
    Author biography:

    LENG Qiang-Kui  Ph.D. candidate at the College of Computer Science, Beijing University of Technology. His research interests include pattern recognition and machine learning. E-mail: qkleng@gmail.com

Abstract: The multiconlitron is a general theoretical framework for constructing piecewise linear classifiers. For the convexly separable and commonly separable cases, it separates the two classes of samples with the support conlitron algorithm (SCA) and the support multiconlitron algorithm (SMA), respectively. Building on this framework, this paper proposes a maximal-cutting-based construction method for the multiconlitron. The method consists of two training stages. The first stage, the maximal cutting process (MCP), iteratively searches for the linear boundary that cuts off the largest number of samples, thereby building as small a decision function set as possible; minimizing the number of linear functions in this set ultimately simplifies the classification model. The second stage, the boundary adjusting process (BAP), retrains the initial classification boundaries obtained by MCP and moves them to more appropriate positions so as to improve the generalization ability of the perceptron. Numerical experiments show that the method yields a more reasonable classification model and improves the performance of the perceptron. Comparisons with other typical piecewise linear classifiers further demonstrate the effectiveness and competitiveness of the proposed method.
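
To make the two-stage procedure in the abstract concrete, the sketch below illustrates only the greedy cutting loop underlying the first stage (MCP): each iteration learns one linear boundary, removes the negative samples it cuts off, and the collected boundaries together act as a single convex piece whose decision value is the minimum over its linear functions. This is a minimal illustration rather than the authors' implementation: the function names maximal_cutting_process and conlitron_decide are hypothetical, scikit-learn's LinearSVC is assumed as the boundary learner, the search for the boundary that cuts the most samples is simplified to a single SVM fit per iteration, and both the boundary adjusting stage (BAP) and the union over multiple convex pieces of a full multiconlitron are omitted.

    # Illustrative sketch only (not the paper's algorithm): greedily collect linear
    # boundaries until every negative sample has been cut off from the positive class.
    import numpy as np
    from sklearn.svm import LinearSVC

    def maximal_cutting_process(X_pos, X_neg, max_iters=50):
        """Return a list of (w, b) boundaries forming one convex piece (conlitron)."""
        boundaries = []
        X_pos = np.asarray(X_pos, dtype=float)
        remaining = np.asarray(X_neg, dtype=float)    # negatives not yet cut off
        for _ in range(max_iters):
            if len(remaining) == 0:
                break
            # Learn one linear boundary between the positives and the remaining negatives.
            X = np.vstack([X_pos, remaining])
            y = np.hstack([np.ones(len(X_pos)), -np.ones(len(remaining))])
            clf = LinearSVC(C=1.0, max_iter=10000).fit(X, y)
            w, b = clf.coef_[0], clf.intercept_[0]
            cut = remaining @ w + b < 0               # negatives on the negative side are cut off
            if not cut.any():                         # no progress -> stop
                break
            boundaries.append((w, b))
            remaining = remaining[~cut]               # keep only the negatives still to be cut
        return boundaries

    def conlitron_decide(x, boundaries):
        """Positive iff x lies on the positive side of every boundary (min over linear functions)."""
        return min(x @ w + b for w, b in boundaries) > 0

In the paper's full method, each iteration explicitly seeks the boundary that cuts the largest number of samples (hence "maximal cutting"), and the BAP stage then retrains the retained boundaries to place them more appropriately and improve generalization.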
Publication History
  • Received:  2012-10-12
  • Revised:  2013-10-25
  • Published:  2014-04-20
