Recurrent neural network architectures

Bianchi, Filippo Maria; Maiorino, Enrico; Rizzi, Antonello
2017

Abstract

In this chapter, we present three different recurrent neural network architectures that we employ for the prediction of real-valued time series. All the models reviewed in this chapter can be trained through the previously discussed backpropagation through time procedure. First, we present the most basic version of recurrent neural networks, the Elman recurrent neural network. Then, we introduce two popular gated architectures, the long short-term memory and the gated recurrent unit. We discuss the main advantages of these more sophisticated architectures, in particular their capability to capture much longer time dependencies by maintaining an internal memory over longer periods. For each of the reviewed networks, we provide the details and show the equations for updating the internal state and computing the output at each time step. Finally, for each recurrent neural network we give a brief overview of its main applications in previous works on real-valued time series forecasting.
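As an illustration of the kind of state-update equations the chapter presents, the standard Elman recurrence can be written as follows. This is a minimal sketch in common textbook notation, not necessarily the notation used in the chapter: W, U and b denote generic weight matrices and bias vectors, and the sigma symbols generic activation functions.

\begin{align}
  h_t &= \sigma_h\!\left(W_h x_t + U_h h_{t-1} + b_h\right),\\
  y_t &= \sigma_y\!\left(W_y h_t + b_y\right),
\end{align}

where $x_t$ is the input at time step $t$, $h_t$ the internal (hidden) state carried over from the previous step, and $y_t$ the output. The gated architectures (LSTM and GRU) extend this recurrence with multiplicative gates that control how much of the previous state is retained, which is what allows them to maintain an internal memory over longer time spans.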
Appears in: Recurrent Neural Networks for Short-Term Load Forecasting: An Overview and Comparative Analysis
ISBN: 978-3-319-70337-4 (print); 978-3-319-70338-1 (online)
Keywords: Elman recurrent neural network; Gated architectures; Gated recurrent unit; Long short-term memory; Time series prediction applications; Computer Science (all)
Item type: 02 Publication in a volume :: 02a Chapter or Article
Recurrent neural network architectures / Bianchi, Filippo Maria; Maiorino, Enrico; Kampffmeyer, Michael C.; Rizzi, Antonello; Jenssen, Robert. - (2017), pp. 15-20. - SPRINGERBRIEFS IN COMPUTER SCIENCE. [10.1007/978-3-319-70338-1_3].
Files attached to this item

Bianchi_Recurrent-neural_2017.pdf (archive administrators only)
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 199.61 kB
  Format: Adobe PDF
  Access: Contact the author

Bianchi_Recurrent_Frontespizio-colophon-indice_2017.pdf (archive administrators only)
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 1.48 MB
  Format: Adobe PDF
  Access: Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1119350
Citations
  • PMC: ND
  • Scopus: 32
  • Web of Science: ND