A Fast Neighbor Prototype Selection Algorithm Based on Local Mean and Class Global Information

LI Juan, WANG Yu-Ping

Citation: LI Juan, WANG Yu-Ping. A Fast Neighbor Prototype Selection Algorithm Based on Local Mean and Class Global Information. ACTA AUTOMATICA SINICA, 2014, 40(6): 1116-1125. doi: 10.3724/SP.J.1004.2014.01116


doi: 10.3724/SP.J.1004.2014.01116
Funds: Supported by National Natural Science Foundation of China (61272119)

Details
    Biography of author:

    LI Juan  Ph.D. candidate at the School of Computer Science and Technology, Xidian University, and lecturer at the School of Distance Education, Shaanxi Normal University. Her main research interests include data mining and pattern recognition. E-mail: ally2004@126.com


  • Abstract: The condensed nearest neighbor rule is a simple nonparametric prototype selection algorithm, but its prototype selection is easily disturbed by the order in which samples are read and by outlier samples. To overcome these problems, a neighbor prototype selection method based on local mean and class global information is proposed. During prototype selection, the method makes full use of the local means of the k nearest same-class and different-class neighbors of each sample to be learned in the prototype set, together with class global information; it also adopts an update strategy that keeps the prototype set dynamically updated. The method not only largely overcomes the influence of reading order and outlier samples on prototype selection and reduces the size of the prototype set, but also achieves a high compression rate on the data set while maintaining high classification accuracy. Experimental results on image recognition and UCI (University of California Irvine) benchmark data sets show that the proposed algorithm achieves more effective classification performance than the compared algorithms.
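The "local mean" idea the abstract builds on can be illustrated with a minimal local mean-based nearest neighbor classifier: a query point is assigned to the class whose local mean of its k nearest same-class samples lies closest. This is only an illustrative sketch of the underlying concept, not the paper's full algorithm; the function name and the choice of k are assumptions.

```python
import numpy as np

def local_mean_knn_predict(X_train, y_train, x, k=3):
    """Assign x to the class whose local mean (mean of that class's k
    nearest training samples to x) is closest to x."""
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        Xc = X_train[y_train == label]
        # distances from x to every sample of this class
        d = np.linalg.norm(Xc - x, axis=1)
        kk = min(k, len(Xc))
        # local mean of the k nearest same-class samples
        local_mean = Xc[np.argsort(d)[:kk]].mean(axis=0)
        dist = np.linalg.norm(local_mean - x)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

The proposed method goes further than this sketch: it uses local means of both same-class and different-class neighbors, incorporates class global information, and dynamically updates the prototype set rather than classifying against the full training set.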
Publication history
  • Received: 2013-06-19
  • Revised: 2013-11-11
  • Published: 2014-06-20
