
Optimized training and scalable implementation of Conditional Deep Neural Networks with early exits for Fog-supported IoT applications / Baccarelli, E.; Scardapane, S.; Scarpiniti, M.; Momenzadeh, A.; Uncini, A. - In: INFORMATION SCIENCES. - ISSN 0020-0255. - 521:(2020), pp. 107-143. [10.1016/j.ins.2020.02.041]

Optimized training and scalable implementation of Conditional Deep Neural Networks with early exits for Fog-supported IoT applications

Baccarelli E.; Scardapane S.; Scarpiniti M.; Momenzadeh A.; Uncini A.
2020

Abstract

The coming era of IoT big data calls for efficient, resource-constrained mining of large sets of distributed data. This paper explores a possible approach to this end, combining two emerging paradigms: Conditional Deep Neural Networks (CDNNs) with early exits and Fog Computing. Beyond describing the general framework, we provide four specific contributions. First, after reviewing the basic architectures of CDNNs with early exits and characterizing their computational capacity, we consider three basic algorithms for their supervised training (namely, the End-to-End, Layer-Wise and Classifier-Wise training algorithms) and then formally characterize and compare the resulting trade-offs in a Fog-supported implementation. Second, after presenting a reference architecture for the local classifiers equipping the considered CDNNs, we develop an optimized framework for the parallel and distributed setting of their decision thresholds. Third, we propose a greedy algorithm for efficiently placing the early exits on the considered CDNNs and prove its linearly scaling complexity. Fourth, we analytically characterize in closed form and analyze the energy performance of the optimal CDNN-onto-Fog mapping. Finally, extensive numerical tests compare the energy-vs.-implementation-complexity-vs.-accuracy performance of the resulting optimized CDNN-over-Fog platforms on the IoT-oriented SVHN and FER-2013 datasets.
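The early-exit mechanism summarized in the abstract can be illustrated with a minimal sketch: each backbone layer feeds a local classifier, and inference stops at the first exit whose top softmax confidence clears that exit's decision threshold. This is only an illustrative toy (random weights, hypothetical thresholds), not the paper's actual architecture or threshold-optimization framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy 3-stage CDNN: each backbone "layer" is a random linear map,
# each early exit is a linear local classifier over the current hidden state.
layers = [rng.standard_normal((8, 8)) * 0.5 for _ in range(3)]
exits = [rng.standard_normal((4, 8)) * 0.5 for _ in range(3)]
thresholds = [0.9, 0.7, 0.0]  # per-exit confidence thresholds; the last exit always fires

def predict(x):
    """Return (predicted class, index of the exit used): stop at the
    first exit whose top softmax probability clears its threshold."""
    h = x
    for i, (W, C, t) in enumerate(zip(layers, exits, thresholds)):
        h = np.tanh(W @ h)   # backbone layer
        p = softmax(C @ h)   # local classifier at exit i
        if p.max() >= t:     # confident enough -> exit early
            return int(p.argmax()), i
    return int(p.argmax()), len(layers) - 1

cls, used_exit = predict(rng.standard_normal(8))
print(cls, used_exit)
```

Easy inputs thus leave through the first exits and spend less computation (and, on a Fog platform, less energy and bandwidth), while hard inputs traverse the full network; the thresholds govern this accuracy-vs.-cost trade-off.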
CDNNs with early exits; energy efficiency-vs.-accuracy-vs.-implementation complexity trade-off; fog computing; IoT Big data stream; memory footprint; optimized CDNN-over-Fog execution platforms; supervised distributed training
01 Journal publication::01a Journal article
Files attached to this product
Baccarelli_Optimized-training_2020.pdf (4.94 MB, Adobe PDF)
Access: repository administrators only
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1370829
Citations
  • PMC: n/a
  • Scopus: 22
  • Web of Science (ISI): 17