Hyperdimensional computing for efficient distributed classification with randomized neural networks

Rosato A.; Panella M.; Kleyko D.
2021

Abstract

In the supervised learning domain, given the recent prevalence of algorithms with high computational cost, attention is shifting towards simpler, lighter, and less computationally expensive training and inference approaches. In particular, randomized algorithms are currently seeing a resurgence, owing to their simple and general formulation. Using randomized neural networks, we study distributed classification, which can be employed in situations where data can be neither stored at a central location nor shared. We propose a more efficient solution for distributed classification by applying a lossy compression approach when sharing the local classifiers with other agents. This approach originates from the framework of hyperdimensional computing and is adapted herein. Experiments on a collection of datasets demonstrate that the proposed approach usually achieves higher accuracy than local classifiers and comes close to the benchmark, the centralized classifier. This work can be considered a first step towards analyzing the variegated horizon of distributed randomized neural networks.
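
The abstract describes the compression scheme only at a high level. As a rough illustration of the kind of hyperdimensional key-value superposition such a scheme could build on, the following is a minimal NumPy sketch under assumed sizes and a bipolar binding scheme, not the authors' actual implementation: each column of a local readout matrix is bound to a random class key shared by all agents, and the bound vectors are summed into a single vector that is c-fold cheaper to share and approximately invertible at the receiver.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes: d hidden units, c classes.
    d, c = 2048, 10
    # Bipolar {-1, +1} key vectors, one per class, shared by all agents.
    keys = rng.choice([-1.0, 1.0], size=(c, d))

    def compress(W):
        # Bind each column of the d x c readout matrix W to its class key
        # (elementwise product) and superimpose the results into one vector.
        return np.sum(keys * W.T, axis=0)            # shape (d,)

    def decompress(s):
        # Bipolar keys are self-inverse under elementwise product, so
        # keys * s yields each column plus crosstalk from the others.
        return (keys * s).T                          # shape (d, c)

    W = rng.normal(size=(d, c))    # stand-in for one agent's trained readout
    s = compress(W)                # single d-dim vector to share with peers
    W_hat = decompress(s)          # approximate recovery at the receiver

    # The compression is lossy: crosstalk grows with the number of
    # superimposed columns, so the recovered columns only correlate
    # with the originals rather than matching them exactly.
    cos = np.sum(W * W_hat, axis=0) / (
        np.linalg.norm(W, axis=0) * np.linalg.norm(W_hat, axis=0))
    print("per-class cosine similarity:", np.round(cos, 2))

Because bipolar binding is self-inverse, unbinding recovers each column up to zero-mean crosstalk from the remaining superimposed columns, which is precisely what makes the compression lossy rather than exact.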
2021
2021 International Joint Conference on Neural Networks, IJCNN 2021
hyperdimensional computing; random vector functional link networks; randomized neural networks
04 Publication in conference proceedings::04b Conference paper in a volume
Hyperdimensional computing for efficient distributed classification with randomized neural networks / Rosato, A.; Panella, M.; Kleyko, D.. - 2021:(2021), pp. 1-10. (Intervento presentato al convegno 2021 International Joint Conference on Neural Networks, IJCNN 2021 tenutosi a Shenzhen; China - Virtual) [10.1109/IJCNN52387.2021.9533805].
Files attached to this item

File: Rosato_Hyperdimensional_2021.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 2.87 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1580318
Citations
  • PMC: n/a
  • Scopus: 11
  • Web of Science: 1