# Expectation Maximization

## Definition of Expectation Maximization

Expectation Maximization (EM): A statistical algorithm used to find maximum likelihood estimates of parameters in a probabilistic model. EM iteratively maximizes the expected log-likelihood of the data under the model by adjusting the model's parameters.
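In symbols, with observed data $X$, latent (unobserved) variables $Z$, and parameters $\theta$, the two alternating steps are often written as:

```latex
% E step: expected complete-data log-likelihood under the current estimate \theta^{(t)}
Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \mid X, \theta^{(t)}}\left[ \log p(X, Z \mid \theta) \right]

% M step: choose the parameters that maximize that expectation
\theta^{(t+1)} = \arg\max_{\theta} \, Q(\theta \mid \theta^{(t)})
```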

## What is Expectation Maximization used for?

Expectation Maximization (EM) is a statistical algorithm for finding maximum likelihood estimates of parameters in probabilistic models when the data are incomplete or depend on unobserved latent variables. It is an iterative technique that alternates between an expectation (E) step, which constructs the expected log-likelihood of the complete data evaluated using the current parameter estimates, and a maximization (M) step, which computes new parameters maximizing the expected log-likelihood found in the E step. In simple terms, EM can be used to find unknown parameters of a probability distribution or model given incomplete or partially observed data.

The method was formalized by Dempster, Laird, and Rubin in 1977 and has since been used in many fields, from statistics to machine learning, including input reconstruction in neural networks and clustering with mixture models. EM is especially attractive because, in many models, the M step has a closed-form solution, so the algorithm avoids the explicit gradient computations required by most optimization techniques. Each iteration is guaranteed not to decrease the likelihood, which makes EM numerically stable; however, like gradient ascent, it converges only to a local maximum (or saddle point) of the likelihood rather than a guaranteed global optimum, so in practice it is often run from multiple initializations.
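As an illustration of the E/M alternation described above, here is a minimal sketch of EM for a two-component one-dimensional Gaussian mixture, using only the standard library. The function name `em_gmm_1d`, the fixed iteration count, and the initialization of the means at the data extremes are all illustrative choices, not part of a standard API.

```python
import math

def gauss_pdf(x, mu, var):
    """Density of a 1D Gaussian with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, n_iter=50):
    """Illustrative EM for a two-component 1D Gaussian mixture."""
    # Initialize: equal mixing weight, means at the data extremes, unit variances.
    pi = 0.5
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    for _ in range(n_iter):
        # E step: responsibility r[i] = P(component 0 | x_i) under current parameters.
        r = []
        for x in data:
            p0 = pi * gauss_pdf(x, mu[0], var[0])
            p1 = (1 - pi) * gauss_pdf(x, mu[1], var[1])
            r.append(p0 / (p0 + p1))
        # M step: re-estimate weight, means, and variances from the responsibilities.
        n0 = sum(r)
        n1 = len(data) - n0
        pi = n0 / len(data)
        mu[0] = sum(ri * x for ri, x in zip(r, data)) / n0
        mu[1] = sum((1 - ri) * x for ri, x in zip(r, data)) / n1
        # Small floor keeps variances positive when a cluster collapses.
        var[0] = sum(ri * (x - mu[0]) ** 2 for ri, x in zip(r, data)) / n0 + 1e-6
        var[1] = sum((1 - ri) * (x - mu[1]) ** 2 for ri, x in zip(r, data)) / n1 + 1e-6
    return pi, mu, var
```

For example, running `em_gmm_1d([-0.2, 0.1, 0.0, 4.9, 5.1, 5.2])` recovers one component near 0 and one near 5 with roughly equal mixing weight. Note that the result depends on the initialization, which reflects EM's convergence to a local rather than global maximum.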