# The expectation-maximization algorithm

**Todd K. Moon**

CiteWeb id: 19960000485

CiteWeb score: 1996

DOI: 10.1109/79.543975

A common task in signal processing is the estimation of the parameters of a probability distribution function. Perhaps the most frequently encountered estimation problem is the estimation of the mean of a signal in noise. In many parameter estimation problems the situation is more complicated because direct access to the data necessary to estimate the parameters is impossible, or some of the data are missing. Such difficulties arise when an outcome is a result of an accumulation of simpler outcomes, or when outcomes are clumped together, for example, in a binning or histogram operation. There may also be data dropouts or clustering in such a way that the number of underlying data points is unknown (censoring and/or truncation). The EM (expectation-maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation. The EM algorithm is presented at a level suitable for signal processing practitioners who have had some exposure to estimation theory.
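The many-to-one mapping the abstract describes can be illustrated with a standard textbook example: a two-component Gaussian mixture, where the component label of each sample is the hidden data. The sketch below (plain NumPy, with illustrative parameter choices not taken from the paper) alternates the E-step (posterior "responsibilities" under the current parameters) and the M-step (responsibility-weighted ML re-estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic incomplete data: draws from a two-component Gaussian
# mixture; the component labels are the unobserved (hidden) data.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

# Initial guesses for mixing weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior probability (responsibility) that each sample
    # came from each component, given the current parameter estimates.
    w = pi * normal_pdf(x[:, None], mu, var)   # shape (n, 2)
    w /= w.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters by responsibility-weighted ML.
    n_k = w.sum(axis=0)
    pi = n_k / len(x)
    mu = (w * x[:, None]).sum(axis=0) / n_k
    var = (w * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

print(pi, mu)  # estimates should approach the generating weights and means
```

Each iteration is guaranteed not to decrease the observed-data likelihood, which is the central property the article develops.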

Links:

- condor.depaul.edu/ntomuro/courses/578/assign/finalprojfiles/samplepaper2.pdf
- courses.cs.washington.edu/courses/cse312/11wi/slides/12em.pdf
- www.georg-boecherer.de/repository/emAlgorithm.pdf
- cs.jhu.edu/~jason/465/PDFSlides/lect26-em.pdf
- www.vision.caltech.edu/html-files/EE148-2004/lectures/lecture15-EM.pdf
- www.math.chalmers.se/Stat/Grundutb/CTH/mve186/1415/em_lecture.pdf
- www.cse.unr.edu/~bebis/MathMethods/EM/lecture.pdf
- rajlab.seas.upenn.edu/pdfs/easyEM.pdf
- ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=543975
- adsabs.harvard.edu/abs/1996ISPM...13...47M
