Support Vector Machine for Domain Adaptation Based on Class Distribution

YING Wen-Hao, WANG Shi-Tong, DENG Zhao-Hong, WANG Jun

Citation: YING Wen-Hao, WANG Shi-Tong, DENG Zhao-Hong, WANG Jun. Support Vector Machine for Domain Adaptation Based on Class Distribution. ACTA AUTOMATICA SINICA, 2013, 39(8): 1273-1288. doi: 10.3724/SP.J.1004.2013.01273

doi: 10.3724/SP.J.1004.2013.01273

Funds: Supported by National Natural Science Foundation of China (60975027, 61170122) and Natural Science Foundation of Jiangsu Province (BK2011003)

Article information
    Author biography:

    WANG Shi-Tong  Professor at the School of Digital Media, Jiangnan University. His main research interests include artificial intelligence, pattern recognition, and bioinformatics. E-mail: wxwangst@yahoo.com.cn

  • Abstract: When defining the distribution distance between domains, existing domain adaptation methods usually consider only the overall distribution of the domain samples and do not treat the distributions of the labeled samples class by class, which leads to certain limitations in applications with imbalanced datasets. To address this, by fully exploiting the class information of the source-domain samples and building on the structural risk minimization model, this paper proposes a domain adaptation support vector machine based on class distribution (CDASVM) and extends it to a version that handles multiple source domains (CDASVM from multiple sources, MSCDASVM). Experimental results on synthetic and real-world imbalanced datasets show that the proposed methods achieve better or comparable pattern classification performance.
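To make the class-distribution idea above concrete, the following is a minimal Python/NumPy sketch that contrasts an overall empirical maximum mean discrepancy (MMD) with a class-wise MMD on an imbalanced toy problem. It is an illustration only, not the paper's CDASVM objective: the RBF kernel width, the biased MMD estimator, the toy data, and the target pseudo-labels (`yt_pseudo`) are assumptions introduced here; per the abstract, CDASVM builds the class-conditional distance into a structural-risk-minimization objective rather than computing it as a standalone statistic.

```python
# Illustrative sketch (not the authors' implementation) of a class-wise
# distribution distance. Target class labels are NOT available in domain
# adaptation; yt_pseudo stands in for, e.g., pseudo-labels from a baseline
# classifier, which is an assumption made here for illustration only.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(Xs, Xt, gamma=1.0):
    """Biased empirical squared MMD between two sample sets."""
    return (rbf_kernel(Xs, Xs, gamma).mean()
            + rbf_kernel(Xt, Xt, gamma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, gamma).mean())

def classwise_mmd2(Xs, ys, Xt, yt_pseudo, gamma=1.0):
    """Average per-class squared MMD; classes missing from either set are skipped."""
    dists = []
    for c in np.unique(ys):
        Xs_c, Xt_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):
            dists.append(mmd2(Xs_c, Xt_c, gamma))
    return float(np.mean(dists)) if dists else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Imbalanced toy data: the minority class (5% of samples) shifts
    # between domains, the majority class does not.
    Xs = np.vstack([rng.normal(0, 1, (190, 2)), rng.normal(3, 1, (10, 2))])
    ys = np.array([0] * 190 + [1] * 10)
    Xt = np.vstack([rng.normal(0, 1, (190, 2)), rng.normal(5, 1, (10, 2))])
    yt = np.array([0] * 190 + [1] * 10)  # assumed pseudo-labels for the target
    print("overall MMD^2    :", round(mmd2(Xs, Xt), 4))
    print("class-wise MMD^2 :", round(classwise_mmd2(Xs, ys, Xt, yt), 4))
```

On data like this, the overall MMD stays small because the shifted minority class contributes only a few samples, whereas the class-wise distance exposes the shift; this is the limitation of whole-distribution distances on imbalanced data that the abstract points out.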
Publication history
  • Received:  2012-03-16
  • Revised:  2012-07-15
  • Published:  2013-08-20
