TY - GEN
T1 - MLMG
T2 - 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events, PerCom Workshops 2021
AU - Qin, Yang
AU - Kondo, Masaaki
N1 - Funding Information:
This work was supported by JST CREST Grant Number JPMJCR20F2, Japan.
Publisher Copyright:
© 2021 IEEE.
PY - 2021/3/22
Y1 - 2021/3/22
N2 - Federated learning has attracted much interest as a solution for collaboratively learning a prediction model without sharing users' training data. Existing federated learning approaches usually develop a single independent local model for each client to train its privacy-sensitive data and then adopt a single centralized global model to exchange the trained parameters of the clients participating in federated training. However, given the diverse characteristics of local data and the heterogeneity across participating clients, the conventional federated learning paradigm may not achieve uniformly good performance over all users. In this work, we propose a novel federated learning mechanism that uses Multi-Local and Multi-Global (MLMG) model aggregation to train non-IID user data with clustering methods. A matching algorithm is then introduced to derive the appropriate exchanges between local models and global models. The new mechanism separates data and users with different characteristics, making it easier to capture the heterogeneity of data distributions across users. We evaluate the proposal with a state-of-the-art on-device neural network for anomaly detection, and experimental results on several benchmark datasets demonstrate better detection accuracy (up to a 2.83% accuracy improvement) for the new paradigm compared with a conventional federated learning approach.
AB - Federated learning has attracted much interest as a solution for collaboratively learning a prediction model without sharing users' training data. Existing federated learning approaches usually develop a single independent local model for each client to train its privacy-sensitive data and then adopt a single centralized global model to exchange the trained parameters of the clients participating in federated training. However, given the diverse characteristics of local data and the heterogeneity across participating clients, the conventional federated learning paradigm may not achieve uniformly good performance over all users. In this work, we propose a novel federated learning mechanism that uses Multi-Local and Multi-Global (MLMG) model aggregation to train non-IID user data with clustering methods. A matching algorithm is then introduced to derive the appropriate exchanges between local models and global models. The new mechanism separates data and users with different characteristics, making it easier to capture the heterogeneity of data distributions across users. We evaluate the proposal with a state-of-the-art on-device neural network for anomaly detection, and experimental results on several benchmark datasets demonstrate better detection accuracy (up to a 2.83% accuracy improvement) for the new paradigm compared with a conventional federated learning approach.
KW - aggregation mechanism
KW - anomaly detection
KW - federated learning
KW - multi-local and multi-global
KW - on-device neural network
UR - http://www.scopus.com/inward/record.url?scp=85107603128&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107603128&partnerID=8YFLogxK
U2 - 10.1109/PerComWorkshops51409.2021.9431011
DO - 10.1109/PerComWorkshops51409.2021.9431011
M3 - Conference contribution
AN - SCOPUS:85107603128
T3 - 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events, PerCom Workshops 2021
SP - 565
EP - 571
BT - 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events, PerCom Workshops 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 22 March 2021 through 26 March 2021
ER -