Adaptive resolution min-max classifiers

Rizzi, Antonello; Panella, Massimo; Frattale Mascioli, Fabio Massimo
2002

Abstract

A high degree of automation is one of the most important features of data-driven modeling tools, and it should be taken into consideration in the design of classification systems. In this regard, constructive training algorithms are essential to improve the automation degree of a modeling system. Among neuro-fuzzy classifiers, Simpson's Min-Max networks have the advantage of being trainable in a constructive way. The use of the hyperbox as a frame on which different membership functions can be tailored makes the Min-Max model a flexible tool. However, the original training algorithm exhibits some serious drawbacks, together with a low degree of automation. To overcome these shortcomings, this paper proposes two new learning algorithms for fuzzy Min-Max neural classifiers: the adaptive resolution classifier (ARC) and its pruning version (PARC). ARC/PARC generates a regularized Min-Max network by a succession of hyperbox cuts. The generalization capability of the ARC/PARC technique depends mostly on the adopted cutting strategy; by using a recursive cutting procedure (R-ARC and R-PARC), it is possible to obtain better results. ARC, PARC, R-ARC, and R-PARC are characterized by a high degree of automation and yield networks with a remarkable generalization capability. Their performance is evaluated on a set of toy problems and real-data benchmarks. We also propose a suitable index that can be used for the sensitivity analysis of the classification systems under consideration.
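As a rough illustration of the hyperbox idea the abstract refers to (not the ARC/PARC procedure itself), the following Python sketch implements a Simpson-style membership function: a pattern has full membership inside a box defined by its min and max vertices, and membership decays with the distance from the box faces. The `gamma` sensitivity parameter, the example boxes, and the winner-takes-all `classify` helper are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hyperbox_membership(x, v, w, gamma=1.0):
    """Simpson-style membership of pattern x in the hyperbox [v, w].

    x, v, w: 1-D arrays of equal length (v = min vertex, w = max vertex).
    gamma:   sensitivity parameter; larger values make the membership
             decay faster outside the box (illustrative default).
    Returns a value in [0, 1]: 1 inside the box, decreasing with the
    distance from its faces.
    """
    x, v, w = np.asarray(x), np.asarray(v), np.asarray(w)
    # Penalty terms for exceeding the max vertex and falling below the min vertex.
    above = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, x - w)))
    below = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, v - x)))
    return float(np.mean(above + below) / 2.0)

def classify(x, hyperboxes):
    """Winner-takes-all classification over labeled hyperboxes.

    hyperboxes: list of (v, w, class_label) tuples, e.g. as produced by a
    constructive procedure that covers the training data with boxes.
    """
    scores = [(hyperbox_membership(x, v, w), label) for v, w, label in hyperboxes]
    return max(scores)[1]

# Toy usage on normalized 2-D data (hypothetical boxes).
boxes = [
    (np.array([0.0, 0.0]), np.array([0.4, 0.4]), "class A"),
    (np.array([0.6, 0.6]), np.array([1.0, 1.0]), "class B"),
]
print(classify(np.array([0.2, 0.3]), boxes))  # -> "class A"
print(classify(np.array([0.9, 0.7]), boxes))  # -> "class B"
```

In the paper, the set of hyperboxes is not fixed in advance: ARC/PARC builds the Min-Max network constructively, by a succession of hyperbox cuts, which is what gives the method its high degree of automation.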
adaptive resolution classifier (ARC); automatic training; classification; Min-Max model; pruning adaptive resolution classifier (PARC); sensitivity analysis
01 Journal publication::01a Journal article
Adaptive resolution min-max classifiers / Rizzi, Antonello; Panella, Massimo; Frattale Mascioli, Fabio Massimo. - In: IEEE TRANSACTIONS ON NEURAL NETWORKS. - ISSN 1045-9227. - STAMPA. - 13:2(2002), pp. 402-414. [10.1109/72.991426]

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/252719
Warning: the displayed data have not been validated by the university.

Citations
  • PubMed Central: 0
  • Scopus: 102
  • Web of Science: 84