Research Article (Special Issue)

FedSC: A federated learning algorithm based on client-side clustering

  • Received: 19 January 2023 Revised: 27 June 2023 Accepted: 29 June 2023 Published: 19 July 2023
  • In traditional centralized machine learning frameworks, consolidating all data in a central data center for processing raises significant concerns about data privacy breaches and the complexity of data sharing. Federated learning, in contrast, offers a privacy-preserving paradigm that trains models on local devices and thus avoids transferring raw data. However, when client data are non-IID (not independent and identically distributed), the performance of federated learning degrades. To address this problem, this study introduces the FedSC algorithm. FedSC first partitions clients into clusters according to the distribution of their data types. Within each cluster, clients have similar local optima, which facilitates the aggregation of a better global model. Moreover, the global model trained by the previous cluster serves as the initial model parameters for subsequent clusters, so that the data of every cluster contributes to an improved global model. Experimental results show that FedSC outperforms alternative federated learning approaches, particularly under non-IID data distributions, and achieves higher accuracy. (An illustrative sketch of this cluster-wise procedure is given after the citation below.)

    Citation: Zhuang Wang, Renting Liu, Jie Xu, Yusheng Fu. FedSC: A federated learning algorithm based on client-side clustering[J]. Electronic Research Archive, 2023, 31(9): 5226-5249. doi: 10.3934/era.2023266
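    The abstract does not give implementation details, but the procedure it describes (cluster clients by the distribution of their data, run federated averaging within each cluster, and warm-start each cluster from the previous cluster's global model) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the k-means clustering on label histograms, the toy least-squares local objective, and all function names (cluster_clients, fedavg_round, fedsc) are assumptions made for the example.

```python
# Minimal sketch of the cluster-then-sequential-training idea from the abstract.
# Assumptions (not from the paper): clients are grouped by k-means on their
# normalized label histograms, local training is a toy least-squares SGD, and
# aggregation is FedAvg-style weighted averaging within each cluster.
import numpy as np
from sklearn.cluster import KMeans


def label_histogram(labels, num_classes):
    """Normalized class-frequency vector describing a client's label distribution."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / counts.sum()


def cluster_clients(client_labels, num_classes, num_clusters):
    """Group clients whose label distributions (and hence local optima) are similar."""
    hists = np.stack([label_histogram(y, num_classes) for y in client_labels])
    return KMeans(n_clusters=num_clusters, n_init=10, random_state=0).fit_predict(hists)


def local_sgd(w, X, y, lr=0.005, epochs=3):
    """Stand-in for a client's local training: gradient descent on least squares."""
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w


def fedavg_round(w_global, cluster_data):
    """One round inside a cluster: local updates, then data-size-weighted average."""
    updates = [local_sgd(w_global.copy(), X, y) for X, y in cluster_data]
    sizes = np.array([len(y) for _, y in cluster_data], dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)


def fedsc(client_data, client_labels, num_classes, num_clusters, rounds_per_cluster=5):
    """Train cluster by cluster; each cluster starts from the previous cluster's model."""
    assignment = cluster_clients(client_labels, num_classes, num_clusters)
    w = np.zeros(client_data[0][0].shape[1])           # shared global model
    for c in range(num_clusters):
        cluster_data = [client_data[i] for i in np.where(assignment == c)[0]]
        if not cluster_data:
            continue
        for _ in range(rounds_per_cluster):
            w = fedavg_round(w, cluster_data)           # w carries over to the next cluster
    return w


if __name__ == "__main__":
    # Synthetic non-IID clients: each client only holds two of the four classes.
    rng = np.random.default_rng(0)
    num_classes, dim = 4, 10
    client_data, client_labels = [], []
    for _ in range(8):
        classes = rng.choice(num_classes, size=2, replace=False)
        y = rng.choice(classes, size=50)
        X = rng.normal(size=(50, dim)) + y[:, None]     # class-dependent feature shift
        client_data.append((X, y.astype(float)))
        client_labels.append(y)
    w = fedsc(client_data, client_labels, num_classes, num_clusters=2)
    print("trained model norm:", np.linalg.norm(w))
```

    Carrying `w` unchanged from one cluster's rounds into the next reflects the abstract's statement that the global model trained by the previous cluster initializes training for subsequent clusters.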






  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
