Learning-in-the-Fog (LiFo): Deep learning meets Fog Computing for the minimum-energy distributed early-exit of inference in delay-critical IoT realms

Baccarelli E.; Scarpiniti M.; Momenzadeh A.; Sarv Ahrabi S.
2021

Abstract

Fog Computing (FC) and Conditional Deep Neural Networks (CDNNs) with early exits are two emerging paradigms which, up to now, have evolved in a stand-alone fashion. However, their integration is expected to be valuable in IoT applications in which resource-poor devices must mine large volumes of sensed data in real time. Motivated by this consideration, this article focuses on the optimized design and performance validation of Learning-in-the-Fog (LiFo), a novel virtualized technological platform for the minimum-energy and delay-constrained execution of the inference phase of CDNNs with early exits atop multi-tier networked computing infrastructures composed of multiple hierarchically organized wireless Fog nodes. The main research contributions of this article are threefold: (i) we design the main building blocks and supporting services of the LiFo architecture by explicitly accounting for the multiple constraints on the per-exit maximum inference delays of the supported CDNN; (ii) we develop an adaptive algorithm for the minimum-energy distributed joint allocation and reconfiguration of the available computing-plus-networking resources of the LiFo platform. Interestingly, the designed algorithm is capable of self-detecting (typically unpredictable) environmental changes and quickly self-reacting to them by properly re-configuring the available computing and networking resources; and (iii) we design the main building blocks and related virtualized functionalities of an Information-Centric networking architecture, which enables the LiFo platform to perform the aggregation of spatially distributed IoT sensed data. The energy-vs.-inference-delay performance of LiFo is numerically tested under a number of IoT scenarios and compared against that of some state-of-the-art benchmark solutions that do not rely on Fog support.
adaptive resource allocation and reconfiguration; conditional deep neural networks; distributed multi-tier Fog platforms; early exit of IoT inference; per-exit inference delays; virtualized networked computing architectures
01 Journal publication::01a Journal article
Learning-in-the-Fog (LiFo): Deep learning meets Fog Computing for the minimum-energy distributed early-exit of inference in delay-critical IoT realms / Baccarelli, E.; Scarpiniti, M.; Momenzadeh, A.; Sarv Ahrabi, S.. - In: IEEE ACCESS. - ISSN 2169-3536. - 9:(2021), pp. 25716-25757. [10.1109/ACCESS.2021.3058021]
Files attached to this item

Baccarelli_Learning-in-the-Fog_2021.pdf
  Access: open access
  Type: Publisher's version (published version with the publisher's layout)
  License: Creative Commons
  Size: 4.56 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1508682
Citations
  • PMC: not available
  • Scopus: 35
  • Web of Science (ISI): 26