Widely linear kernels for complex-valued Kernel activation functions / Scardapane, S; Van Vaerenbergh, S; Comminiello, D; Uncini, A. - 2019:(2019), pp. 8528-8532. (Paper presented at the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), held in Brighton, UK) [10.1109/ICASSP.2019.8683864].

Widely linear kernels for complex-valued Kernel activation functions

Scardapane, S; Van Vaerenbergh, S; Comminiello, D; Uncini, A
2019

Abstract

Complex-valued neural networks (CVNNs) have been shown to be powerful nonlinear approximators when the input data can be properly modeled in the complex domain. One of the major challenges in scaling up CVNNs in practice is the design of complex activation functions. Recently, we proposed a novel framework for learning these activation functions neuron-wise in a data-dependent fashion, based on a cheap one-dimensional kernel expansion and the idea of kernel activation functions (KAFs). In this paper we argue that, despite its flexibility, this framework is still limited in the class of functions that can be modeled in the complex domain. We leverage the idea of widely linear complex kernels to extend the formulation, allowing for a richer expressiveness without an increase in the number of adaptable parameters. We test the resulting model on a set of complex-valued image classification benchmarks. Experimental results show that the resulting CVNNs can achieve higher accuracy while at the same time converging faster.
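To illustrate the idea described in the abstract, the following is a minimal sketch of a kernel activation function (KAF) extended with a widely linear complex kernel. The exact kernel and parameterization used in the paper are not reproduced here; this sketch assumes a complex Gaussian kernel and combines the kernel evaluated on each dictionary point and on its conjugate, so the single coefficient vector `alpha` keeps the parameter count equal to a standard KAF. All function and variable names are illustrative.

```python
import numpy as np

def complex_gaussian_kernel(z, d, gamma=1.0):
    # 1D Gaussian kernel on the complex plane (real-valued output),
    # evaluated between activations z and dictionary points d.
    return np.exp(-gamma * np.abs(z - d) ** 2)

def widely_linear_kaf(z, dictionary, alpha, gamma=1.0):
    """Sketch of a widely linear KAF.

    The kernel is evaluated against both each dictionary point and its
    complex conjugate, and the two terms share one complex coefficient
    vector alpha -- so expressiveness grows without adding parameters.
    z: (N,) complex pre-activations; dictionary: (D,) fixed complex
    dictionary; alpha: (D,) learnable complex mixing coefficients.
    """
    k_lin = complex_gaussian_kernel(z[:, None], dictionary[None, :], gamma)
    k_conj = complex_gaussian_kernel(z[:, None],
                                     np.conj(dictionary)[None, :], gamma)
    return (k_lin + k_conj) @ alpha

# Example usage with a small fixed dictionary:
dictionary = np.array([1 + 1j, -1 - 1j, 0 + 0j])
alpha = np.array([0.3 + 0.1j, -0.2j, 0.5 + 0j])
z = np.array([0.5 + 0.5j, -1.0 + 0.2j])
out = widely_linear_kaf(z, dictionary, alpha)  # (2,) complex outputs
```

In a CVNN, one such expansion would be applied neuron-wise, with a separate `alpha` learned per neuron by backpropagation while the dictionary stays fixed.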
2019
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
complex-valued neural network; activation function; kernel method
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this record

Scardapane_Widely-linear_2019.pdf
  Access: archive administrators only (contact the author)
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 413.7 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1335704
Citations
  • Scopus: 2
  • Web of Science: 2