Special Issue: Theoretical perspectives of federated learning
Guest Editors
Prof. Yang Zhou
Auburn University, USA
Email: yangzhou@auburn.edu
Dr. Ji Liu
Baidu Research, China
Email: liuji04@baidu.com
Dr. Mohamed Reda Bouadjenek
Deakin University, Australia
Email: reda.bouadjenek@deakin.edu.au
Manuscript Topics
The rapid development of information technology and computing hardware, driven by the popularity of smartphones and Internet of Things devices, offers a great opportunity to train intelligent machine learning systems over large-scale data to solve complex tasks. With growing privacy and efficiency concerns, it is increasingly attractive to store data and train models locally across multiple devices, organizations, or institutions. Conventional centralized machine learning techniques train models on a central server or in a data center over raw data collected from multiple sources into centralized storage, which incurs severe privacy and efficiency issues. Federated learning can instead process distributed data either through distributed methods with a centralized coordinator or through fully decentralized methods without one, exchanging intermediate data (e.g., model updates) rather than the raw data so as to protect data security and privacy. There is a crucial demand to develop a comprehensive federated learning framework from the theoretical perspectives of distributed, decentralized, and federated learning to support intelligent applications over distributed data. In addition, theoretical perspectives on security, privacy, fairness, explainability, hyper-parameter tuning, and optimization for federated learning are also critical for the future advancement of learning with distributed data.
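As a minimal illustration of the coordinator-based setting described above (not part of the call itself), the federated averaging scheme can be sketched as follows. The toy scalar model, client data, and function names here are assumptions for the sake of the example; only model weights, not raw data, leave each client.

```python
import random

def local_update(w, data, lr=0.1):
    """One client's local SGD pass on its private shard.
    Toy model: scalar linear regression y ≈ w * x."""
    for x, y in data:
        grad = 2.0 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def fed_avg(client_datasets, rounds=20, w0=0.0):
    """Central coordinator: broadcasts the global weight, collects
    locally updated weights (intermediate data), and averages them
    weighted by each client's dataset size. Raw data stays local."""
    w = w0
    total = sum(len(d) for d in client_datasets)
    for _ in range(rounds):
        updates = [local_update(w, d) for d in client_datasets]
        w = sum(u * len(d) for u, d in zip(updates, client_datasets)) / total
    return w

# Three clients, each holding private samples of the line y = 3x.
random.seed(0)
clients = [[(x, 3.0 * x) for x in (random.random() for _ in range(5))]
           for _ in range(3)]
w = fed_avg(clients)
```

In the fully decentralized variant, the averaging step would be replaced by gossip-style exchanges among neighboring clients rather than a single coordinator.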
This special issue intends to bring together theoretical perspectives on federated learning that serve as fundamental tools for realizing the vision of future distributed or decentralized intelligent applications and systems, while providing the community with the current state of the art, revealing the important challenges, and studying the fundamental theoretical problems of federated learning with distributed data. We are looking for theoretical contributions that help address the open problems of federated learning with distributed data. Potential topics include, but are not limited to:
• Federated learning, distributed learning, decentralized learning, and other forms of collaborative learning for distributed data processing;
• Distributed and decentralized optimization for federated learning;
• Privacy, security, encryption, and fairness techniques for big distributed data;
• Privacy, security, encryption, and fairness techniques for federated learning;
• Efficient communication and computational techniques for federated learning;
• Techniques for heterogeneous and unbalanced distributed data (non-IID);
• Explainability of federated learning;
• Incentive mechanisms for federated learning;
• Model aggregation algorithms for federated learning;
• Graph techniques for federated learning;
• Parallelization and scheduling for federated learning.
Instructions for authors
https://www.aimspress.com/era/news/solo-detail/instructionsforauthors
Please submit your manuscript via the online submission system:
https://aimspress.jams.pub/