The EM algorithm finds maximum likelihood estimates for models with latent variables. It alternates between an E-step, which computes the posterior distribution over the latent variables under the current parameters (and with it the expected complete-data log-likelihood), and an M-step, which maximizes that expected complete-data log-likelihood with respect to the parameters. For a mixture of Gaussians, the E-step computes the posterior probability (responsibility) that each data point belongs to each component. The M-step then updates the mixture weights, means, and covariances using responsibility-weighted averages of the data.
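To make the two steps concrete, here is a minimal NumPy sketch of one EM iteration for a Gaussian mixture. It assumes data `X` of shape `(n, d)` and `K` components; the function names (`log_gaussian`, `e_step`, `m_step`) are illustrative, not from any particular library.

```python
# Sketch of one EM iteration for a Gaussian mixture model (illustrative, not
# tied to any specific library). Assumes X has shape (n, d).
import numpy as np

def log_gaussian(X, mean, cov):
    """Log density of N(mean, cov) evaluated at each row of X."""
    d = X.shape[1]
    diff = X - mean
    cov_inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    quad = np.einsum('ni,ij,nj->n', diff, cov_inv, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

def e_step(X, weights, means, covs):
    """E-step: posterior responsibility of each component for each point."""
    K = len(weights)
    log_resp = np.stack(
        [np.log(weights[k]) + log_gaussian(X, means[k], covs[k]) for k in range(K)],
        axis=1)                                        # shape (n, K)
    log_resp -= log_resp.max(axis=1, keepdims=True)    # numerical stabilization
    resp = np.exp(log_resp)
    return resp / resp.sum(axis=1, keepdims=True)

def m_step(X, resp):
    """M-step: responsibility-weighted updates of weights, means, covariances."""
    n, d = X.shape
    Nk = resp.sum(axis=0)                              # effective count per component
    weights = Nk / n
    means = (resp.T @ X) / Nk[:, None]
    covs = []
    for k in range(len(Nk)):
        diff = X - means[k]
        covs.append((resp[:, k, None] * diff).T @ diff / Nk[k])
    return weights, means, np.array(covs)
```

In practice the two steps are repeated, `resp = e_step(...)` followed by `m_step(X, resp)`, until the log-likelihood stops improving.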