
Evolutionary Selection of neural networks satisfying leave-one-out criteria / Nardinocchi, G; Jankowski, S; Balsi, Marco. - 6159:(2005), pp. 729-738. (Paper presented at the SPIE XVI Joint Symposium on Photonics and Web Engineering, held in Wilga, Poland, 30 May – 5 June 2005) [10.1117/12.674862].

Evolutionary Selection of neural networks satisfying leave-one-out criteria

BALSI, Marco
2005

Abstract

In non-parametric inference, the error arising from estimating the regression function from a labeled set of training examples can be divided into two main contributions: bias and variance. Neural networks are non-parametric inference models whose bias/variance trade-off is hidden in the network architecture. In recent years, new and powerful tools for neural network selection have been introduced to address the bias-variance dilemma, and the results obtained with the implemented solutions were satisfying [11, 12]. We exploited the measures introduced in these works to implement a genetic algorithm for training neural networks. This method enables a reliable estimation of the generalization error of a neural model. Estimating the error performance makes it possible to correctly drive the genetic evolution toward a fitted model with the desired characteristics. After a brief description of the estimation technique, we present the genetic algorithm implementation, tested on artificial data. Finally, we report the results of the fully automatic algorithm for neural network training and model selection applied to the investigation of the defect structure of semi-insulating materials, based on photo-induced transient spectroscopy experiments.
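The combination described in the abstract — a genetic algorithm whose fitness is a virtual leave-one-out estimate of the generalization error — can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's actual implementation: the candidate "networks" here are random-feature models with a least-squares readout (so the leave-one-out error can be computed analytically via the hat matrix, without any refitting), the genome is simply the hidden-layer size, and all parameter values are assumptions.

```python
# Hedged sketch: genetic selection of network size driven by a virtual
# leave-one-out (PRESS-style) score. All names, models, and parameters are
# illustrative assumptions; the paper's measures and GA encoding may differ.
import numpy as np

rng = np.random.default_rng(0)

# Artificial 1-D regression data, standing in for the abstract's test set.
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

def virtual_loo_mse(n_hidden: int) -> float:
    """Fit a random-feature 'network' (fixed tanh hidden layer, linear
    least-squares readout) and return the virtual leave-one-out MSE via the
    hat matrix identity e_loo_i = e_i / (1 - H_ii): no model is refit."""
    feat_rng = np.random.default_rng(n_hidden)   # deterministic per size
    W = feat_rng.standard_normal((1, n_hidden))
    b = feat_rng.standard_normal(n_hidden)
    Phi = np.tanh(X @ W + b)                     # hidden-layer activations
    Phi = np.hstack([Phi, np.ones((len(X), 1))]) # output bias unit
    lam = 1e-6                                   # small ridge for stability
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    H = Phi @ np.linalg.solve(A, Phi.T)          # hat (projection) matrix
    resid = y - H @ y
    loo = resid / (1.0 - np.diag(H))             # virtual LOO residuals
    return float(np.mean(loo ** 2))

def evolve(pop_size=10, generations=15, max_hidden=30):
    """Toy GA: integer genome = hidden-layer size; truncation selection
    of the best half plus small integer mutations on the offspring."""
    pop = rng.integers(1, max_hidden + 1, size=pop_size)
    for _ in range(generations):
        scores = np.array([virtual_loo_mse(h) for h in pop])
        parents = pop[np.argsort(scores)][: pop_size // 2]
        children = parents + rng.integers(-2, 3, size=len(parents))
        children = np.clip(children, 1, max_hidden)
        pop = np.concatenate([parents, children])
    scores = np.array([virtual_loo_mse(h) for h in pop])
    return int(pop[np.argmin(scores)]), float(scores.min())

best_h, best_score = evolve()
print(best_h, best_score)
```

The design point this illustrates is the one the abstract makes: because the leave-one-out error is obtained cheaply for every candidate, it can serve directly as the GA fitness, steering the evolution toward architectures that generalize rather than merely fit the training set.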
2005
SPIE XVI Joint Symposium on Photonics and Web Engineering
neural networks; virtual leave-one-out; genetic algorithm
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/208092
