Simulated Likelihood for Heteroscedastic Gaussian Mixture Models / Ranalli, Monia; Lindsay, Bruce G. (2014). (Presented at the JSM 2014 conference, held in Boston.)
Simulated Likelihood for Heteroscedastic Gaussian Mixture Models
Monia Ranalli; Bruce G. Lindsay
2014
Abstract
Statistical inference for mixtures of normal components with unequal variances is a challenging task because the likelihood function is unbounded. We provide a new solution by maximizing a marginal likelihood corresponding to the maximal invariant under the group of location and scale transformations. This likelihood is bounded but cannot be calculated explicitly. We start from the simplest case, a two-component normal mixture model, and write its likelihood as a function of only three parameters rather than five. As an innovation in simulated likelihood methodology, we show that if importance sampling is used to simulate the likelihood, an EM algorithm can be applied directly to the resulting objective function. We also introduce an innovation in the importance sampling strategy: the output of an MCMC algorithm whose stationary distribution is the target distribution is used, and the empirical distribution of this realization serves as a seed distribution. The seed distribution is then turned into a continuous kernel density estimator for the target distribution by using the Gibbs transition kernel as the density kernel. The effectiveness of these proposals is demonstrated in a simulation study.
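
As a concrete illustration of the unboundedness that motivates the abstract (not part of the original record), the following minimal Python sketch evaluates the ordinary log-likelihood of a two-component heteroscedastic normal mixture: pinning one component's mean at an observation and shrinking its standard deviation makes the log-likelihood grow without bound. The simulated data and parameter values are illustrative assumptions only, not the authors' code or their marginal-likelihood construction.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x = rng.normal(loc=0.0, scale=1.0, size=50)   # illustrative sample, not the paper's data

    def mixture_loglik(x, pi, mu1, sd1, mu2, sd2):
        # Ordinary (unconstrained) log-likelihood of a two-component
        # heteroscedastic normal mixture.
        dens = pi * norm.pdf(x, loc=mu1, scale=sd1) + (1 - pi) * norm.pdf(x, loc=mu2, scale=sd2)
        return np.sum(np.log(dens))

    # Pin the first component's mean at one observation and shrink its
    # standard deviation: the log-likelihood diverges, which is the
    # degeneracy the marginal (invariant) likelihood is designed to avoid.
    for sd1 in (1.0, 1e-1, 1e-2, 1e-4):
        print(sd1, mixture_loglik(x, 0.5, x[0], sd1, 0.0, 1.0))

The marginal likelihood of the maximal invariant described in the abstract is bounded, but its evaluation requires simulation; the importance-sampling and MCMC-seeded kernel estimator details are given in the presentation itself and are not reproduced here.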