
A BACKWARD SELECTION PROCEDURE FOR APPROXIMATING A DISCRETE PROBABILITY DISTRIBUTION BY DECOMPOSABLE MODELS / Malvestuto, Francesco Mario. - In: KYBERNETIKA. - ISSN 0023-5954. - Print. - 48:(2012), pp. 825-844.

A BACKWARD SELECTION PROCEDURE FOR APPROXIMATING A DISCRETE PROBABILITY DISTRIBUTION BY DECOMPOSABLE MODELS

MALVESTUTO, Francesco Mario
2012

Abstract

Decomposable (probabilistic) models are log-linear models generated by acyclic hypergraphs, and they are known to enjoy a number of useful properties. In many applications the following selection problem naturally arises: given a probability distribution p over a finite set V of n discrete variables and a positive integer k, find a decomposable model of tree-width k that best fits p. If H is the generating hypergraph of a decomposable model and p(H) is the estimate of p under the model, the closeness of p(H) to p can be measured by the information divergence D(p : p(H)), so that the problem above reads: given p and k, find an acyclic, connected hypergraph H of tree-width k such that D(p : p(H)) is minimum. It is well known that this problem is NP-hard. However, for k = 1 it was solved very efficiently by Chow and Liu; thus, starting from an optimal Chow-Liu solution, a few forward-selection procedures have been proposed with the aim of finding a 'good' solution for an arbitrary k. We propose a backward-selection procedure which starts from the (trivial) optimal solution for k = n - 1, and we show that, in a case study taken from the literature, our procedure succeeds in finding an optimal solution for every k.
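For the k = 1 case mentioned in the abstract, the classical Chow-Liu result states that the tree T minimizing D(p : p(T)) is a maximum-weight spanning tree over the pairwise mutual informations. The sketch below illustrates only that well-known step, not the backward-selection procedure proposed in the paper; it assumes the joint distribution p is given as a dictionary mapping tuples of variable values to probabilities, and the function names are hypothetical.

```python
# Minimal sketch of the Chow-Liu step (the known optimal k = 1 solution),
# NOT the backward-selection procedure of the paper.
# Assumption: p is a dict mapping tuples of values (one entry per variable)
# to probabilities; variables are indexed 0..n-1.

from collections import defaultdict
from itertools import combinations
from math import log

def pairwise_mutual_information(p, i, j):
    """I(X_i; X_j) in nats, computed from the joint distribution p."""
    pij = defaultdict(float)
    pi = defaultdict(float)
    pj = defaultdict(float)
    for x, prob in p.items():
        pij[(x[i], x[j])] += prob
        pi[x[i]] += prob
        pj[x[j]] += prob
    return sum(q * log(q / (pi[a] * pj[b]))
               for (a, b), q in pij.items() if q > 0)

def chow_liu_tree(p, n):
    """Maximum-weight spanning tree on mutual-information edge weights
    (Kruskal with union-find); this tree minimizes D(p : p(T)) over trees T."""
    weights = {(i, j): pairwise_mutual_information(p, i, j)
               for i, j in combinations(range(n), 2)}
    parent = list(range(n))

    def find(v):
        # Path-halving union-find lookup.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    tree = []
    for (i, j), w in sorted(weights.items(), key=lambda e: -e[1]):
        ri, rj = find(i), find(j)
        if ri != rj:            # adding the edge keeps the graph acyclic
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

For k > 1 the analogous structures are k-hypertrees; the paper's procedure works in the opposite direction, starting from the trivial optimal solution for k = n - 1 and selecting backward, rather than growing a Chow-Liu tree forward.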
acyclic hypergraph; backward selection; decomposable model; information divergence; k-hypertree
01 Journal publication::01a Journal article


Use this identifier to cite or link to this document: https://hdl.handle.net/11573/496945

Citations
  • Scopus: 6
  • Web of Science: 5