Research article

Unsupervised domain adaptation through transferring both the source-knowledge and target-relatedness simultaneously

  • The authors marked with "§" are co-second authors. (Yanan Zhu and Yao Cheng contributed equally to this work.)
  • Received: 08 November 2022 Revised: 19 December 2022 Accepted: 21 December 2022 Published: 27 December 2022
  • Unsupervised domain adaptation (UDA) is an emerging research topic in machine learning and pattern recognition that aims to aid learning in an unlabeled target domain by transferring knowledge from a labeled source domain. A variety of UDA methods have been proposed, most of which concentrate on the scenario of a single source and a single target domain (1S1T). In real applications, however, a single source domain is often paired with multiple target domains (1SmT), a setting that 1S1T models cannot handle directly. Although a few works on 1SmT UDA have been proposed, almost none of them jointly model the source-domain knowledge and the relatedness among the target domains. To overcome these shortcomings, we propose a more general 1SmT UDA model that transfers both the source-knowledge and the target-relatedness, UDA-SKTR for short. In this way, both the supervision knowledge from the source domain and the potential relatedness among the target domains are modeled and exploited simultaneously in the process of 1SmT UDA. In addition, we construct an alternating optimization algorithm, with a convergence guarantee, to solve for the variables of the proposed model. Finally, through extensive experiments on both benchmark and real datasets, we validate the effectiveness and superiority of the proposed method.

    Citation: Qing Tian, Yanan Zhu, Yao Cheng, Chuang Ma, Meng Cao. Unsupervised domain adaptation through transferring both the source-knowledge and target-relatedness simultaneously[J]. Electronic Research Archive, 2023, 31(2): 1170-1194. doi: 10.3934/era.2023060
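The abstract's "alternating optimization algorithm ... with a convergence guarantee" refers to the standard block-coordinate pattern: hold all but one block of variables fixed, solve that block exactly, and cycle. The sketch below is only a generic illustration of that pattern (it is not the UDA-SKTR solver, whose objective is defined in the paper); it alternates exact least-squares updates of two factor blocks, so the objective is non-increasing at every step, which is the usual basis for such convergence guarantees.

```python
import numpy as np

def alternating_optimization(X, k, n_iters=100, tol=1e-6, seed=0):
    """Illustrative alternating minimization: factor X ~ W @ H by
    alternately solving exact least squares for W (with H fixed) and
    for H (with W fixed). Each block update cannot increase the
    objective ||X - W H||_F^2, so the loss sequence is monotone and
    bounded below, hence convergent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = rng.standard_normal((k, d))  # random initialization of one block
    losses = []
    for _ in range(n_iters):
        # W-step: min_W ||X - W H||_F^2  <=>  min ||X.T - H.T W.T||_F^2
        W = np.linalg.lstsq(H.T, X.T, rcond=None)[0].T
        # H-step: min_H ||X - W H||_F^2 with W fixed
        H = np.linalg.lstsq(W, X, rcond=None)[0]
        losses.append(np.linalg.norm(X - W @ H) ** 2)
        # Stop once the per-iteration decrease falls below tol
        if len(losses) > 1 and losses[-2] - losses[-1] < tol:
            break
    return W, H, losses
```

In the paper's setting the two blocks would instead be the source-knowledge and target-relatedness variables, but the monotone-decrease argument carries over unchanged.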


  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
