On Convergence Properties of the EM Algorithm for Gaussian Mixtures
Author(s)
Jordan, Michael; Xu, Lei
Abstract
"Expectation-Maximization'' (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models.
Date issued
1995-04-21
Other identifiers
AIM-1520
CBCL-111
Series/Report no.
AIM-1520
CBCL-111
Keywords
learning, neural networks, EM algorithm, clustering, mixture models, statistics