
一种基于局部加权均值的领域适应学习框架

皋军 黄丽莉 孙长银

皋军, 黄丽莉, 孙长银. 一种基于局部加权均值的领域适应学习框架. 自动化学报, 2013, 39(7): 1037-1052. doi: 10.3724/SP.J.1004.2013.01037
GAO Jun, HUANG Li-Li, SUN Chang-Yin. A Local Weighted Mean Based Domain Adaptation Learning Framework. ACTA AUTOMATICA SINICA, 2013, 39(7): 1037-1052. doi: 10.3724/SP.J.1004.2013.01037


doi: 10.3724/SP.J.1004.2013.01037

Details
    Corresponding author: SUN Chang-Yin

A Local Weighted Mean Based Domain Adaptation Learning Framework

Funds: 

Supported by National Natural Science Foundation of China (61272210, 60903100), Jiangsu Provincial Natural Science Foundation (BK2011417), Open Project of Jiangsu Provincial Key Laboratory for Computer Information Processing Technology (KJS1126), Open Project of Key Laboratory for Advanced Technology in Environmental Protection of Jiangsu Province (AE201068), and Overseas Education Plan for Young Outstanding Teachers and Presidents of Universities in Jiangsu Province

  • Abstract: Maximum mean discrepancy (MMD) has been successfully applied as a criterion that effectively measures the difference between the source-domain and target-domain distributions. However, as a global measure, MMD to some extent reflects only the differences in global distribution and global structure between the domains. This paper therefore introduces the method and theory of local weighted means into MMD and proposes a projected maximum local weighted mean discrepancy (PMLWD) measure with locality-preserving ability, so that PMLWD can more effectively measure the distributional and structural differences between local partitions of the source and target domains. Combining this measure with traditional learning theory, a local weighted mean based domain adaptation learning framework (LDAF) is proposed, from which two domain adaptation learning methods are derived: LDAF_MLC and LDAF_SVM. Finally, experiments on synthetic datasets, high-dimensional text datasets, and face datasets show that LDAF outperforms other domain adaptation learning methods.
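The MMD criterion that the abstract builds on can be illustrated concretely. The sketch below is the standard biased empirical estimate of the squared MMD with an RBF kernel, not the authors' PMLWD measure or the LDAF framework; the function names, the kernel bandwidth `gamma`, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2).
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))  # clamp tiny negatives from round-off

def mmd2(X, Y, gamma=1.0):
    # Biased empirical estimate of squared MMD between samples X and Y:
    # mean(k(x, x')) - 2 * mean(k(x, y)) + mean(k(y, y')).
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    return Kxx.mean() - 2.0 * Kxy.mean() + Kyy.mean()

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 2))  # source-domain sample
tgt = rng.normal(2.0, 1.0, size=(200, 2))  # mean-shifted target-domain sample

print(mmd2(src, src))  # near zero: same sample, no discrepancy
print(mmd2(src, tgt))  # clearly positive: distributions differ
```

Being a global average over all sample pairs, this estimate illustrates exactly the limitation the paper targets: it summarizes the discrepancy of the whole domains, whereas PMLWD weights points by local neighborhoods to capture differences between local partitions.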
Publication history
  • Received: 2012-10-23
  • Revised: 2013-01-15
  • Published: 2013-07-20
