An Asynchronous Federated Learning Mechanism for Edge Network Computing / Lu, X.; Liao, Y.; Lio, P.; Pan, H. - In: JISUANJI YANJIU YU FAZHAN. - ISSN 1000-1239. - 57:12(2020), pp. 2571-2582. [10.7544/issn1000-1239.2020.20190754]

An Asynchronous Federated Learning Mechanism for Edge Network Computing

Lio P.
2020

Abstract

With the continuous improvement in the performance of IoT and mobile devices, a new computing architecture, edge computing, has emerged. Edge computing changes the situation in which data must be uploaded to the cloud for processing, making full use of the computing and storage capabilities of edge IoT devices. Edge nodes process private data locally and no longer need to upload large amounts of data to the cloud, which reduces transmission delay. The demand for running artificial intelligence frameworks on edge nodes is also growing day by day. Because federated learning does not require data to be centralized for model training, it is well suited to edge-network machine learning scenarios where each node holds only a limited amount of data. This paper proposes an efficient asynchronous federated learning mechanism for edge network computing (EAFLM), which compresses the redundant communication between the nodes and the parameter server during training according to a self-adaptive threshold. A gradient update algorithm based on dual-weight correction allows nodes to join or withdraw from federated learning at any stage of training. Experimental results show that when gradient communication is compressed to 8.77% of the original number of communications, test-set accuracy drops by only 0.03%.
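
To make the compression idea concrete, below is a minimal Python sketch of threshold-based gradient sparsification with a self-adaptively tuned threshold, in the spirit of the mechanism the abstract describes. The function names, the residual-accumulation scheme, and the threshold-update rule are illustrative assumptions, not the paper's exact EAFLM algorithm.

    import numpy as np

    def compress_gradient(grad, residual, threshold):
        # Accumulate the new gradient into the locally kept residual.
        residual = residual + grad
        # Transmit only entries whose accumulated magnitude exceeds the threshold.
        mask = np.abs(residual) >= threshold
        sparse_update = np.where(mask, residual, 0.0)
        # Keep the untransmitted remainder for later rounds.
        residual = np.where(mask, 0.0, residual)
        return sparse_update, residual, mask.mean()

    def adapt_threshold(threshold, sent_fraction, target=0.09, rate=0.05):
        # Hypothetical self-adaptive rule: tighten the threshold when the node
        # sends more than the target fraction of entries, relax it otherwise.
        if sent_fraction > target:
            return threshold * (1.0 + rate)
        return threshold * (1.0 - rate)

    # Example training loop on one node (synthetic gradients).
    rng = np.random.default_rng(0)
    threshold, residual = 0.01, np.zeros(1000)
    for step in range(100):
        grad = rng.normal(scale=0.01, size=1000)
        update, residual, sent = compress_gradient(grad, residual, threshold)
        threshold = adapt_threshold(threshold, sent)
        # `update` is the sparse message pushed to the parameter server.

The residual buffer ensures that suppressed gradient entries are not lost but deferred, which is one common way such schemes keep accuracy loss small at high compression ratios.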
Asynchronous distributed learning; Edge computing; Federated learning; Gradient compression; Privacy-preserving
01 Journal publication::01a Journal article
Files attached to this item

File: Lu_An-Ansynchronous_2020.pdf
Access: open access
Note: DOI 10.7544/issn1000-1239.2020.20190754
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 4.99 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1719731
Citations
  • PMC: n/a
  • Scopus: 18
  • Web of Science: n/a