What is a Gaussian Mixture Model (GMM)?
A Gaussian Mixture Model (GMM) is a probabilistic model that assumes all data points are generated from a mixture of a finite number of Gaussian distributions with unknown parameters. The parameters of the component Gaussians are typically estimated either by maximum a posteriori (MAP) estimation from a well-trained prior model, or by the iterative Expectation-Maximization (EM) algorithm. Gaussian mixture models are very useful for modeling data, especially data that comes from several distinct groups.
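As a sketch of how the EM algorithm mentioned above estimates these unknown parameters, the following minimal 1-D implementation alternates an E-step (compute each point's responsibility under each component) and an M-step (re-estimate weights, means, and variances from those responsibilities). The data, component count, and iteration budget here are illustrative assumptions, not from the original text.

```python
import numpy as np

def em_gmm_1d(x, n_components, n_iters=100, seed=0):
    """Fit a 1-D Gaussian mixture to data x with the EM algorithm."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize: uniform weights, random means drawn from the data,
    # and the overall data variance for every component.
    w = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, n_components, replace=False)
    var = np.full(n_components, x.var())
    for _ in range(n_iters):
        # E-step: responsibility r[j, k] proportional to w_k * N(x_j | mu_k, var_k)
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Synthetic data from two groups centered near -3 and 4 (illustrative).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1.0, 200), rng.normal(4, 1.5, 300)])
w, mu, var = em_gmm_1d(x, 2)
print(np.sort(mu))  # recovered means land near -3 and 4
```

In practice a production implementation would also monitor the log-likelihood for convergence and guard against components collapsing onto a single point; this sketch omits both for brevity.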
Mathematically, a Gaussian mixture model is a parametric probability density function represented as a weighted sum of Gaussian component densities. That is, a mixture of M component Gaussians has the density p(x | λ) = Σ_{i=1}^{M} w_i g(x | μ_i, Σ_i), where x is a D-dimensional continuous-valued data vector, the w_i (i = 1, …, M) are the mixture weights, which sum to one, and the g(x | μ_i, Σ_i) are the component Gaussian densities. The complete model is therefore parameterized by the mean vectors, covariance matrices, and mixture weights of all component densities, collectively written λ = {w_i, μ_i, Σ_i}. Notably, even when each component uses only a diagonal covariance matrix, the linear combination of diagonal-covariance Gaussians can still model correlations between the elements of the feature vector.
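The weighted-sum density above can be evaluated directly. The short sketch below computes p(x | λ) for a hypothetical 1-D two-component mixture (all weights, means, and standard deviations are made-up examples) and checks numerically that the mixture density integrates to one, as any probability density must.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture_pdf(x, weights, means, sigmas):
    """Weighted sum of component densities: p(x | lambda) = sum_i w_i g(x | mu_i, sigma_i)."""
    return sum(w * gaussian_pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

# Hypothetical two-component mixture; the weights must sum to 1.
weights = [0.3, 0.7]
means = [-2.0, 3.0]
sigmas = [1.0, 1.5]

# Numerically integrate the density over a wide interval.
xs = np.linspace(-12, 12, 4001)
densities = mixture_pdf(xs, weights, means, sigmas)
print(np.trapz(densities, xs))  # close to 1.0
```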
Gaussian mixture models are used in biometric systems, most notably speaker recognition, where the parametric model captures the distribution of feature measurements such as vocal-tract spectral features. Gaussian mixture models are also used for density estimation, and mixture-model approaches are considered among the most statistically mature methods for clustering.
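As one hedged illustration of GMM-based clustering, scikit-learn's GaussianMixture (an assumed dependency, not mentioned in the original text) fits a mixture with EM and then provides both hard cluster labels and soft assignments, i.e. each point's posterior probability of belonging to each component:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical 2-D data drawn from two groups (illustrative only).
rng = np.random.default_rng(1)
data = np.vstack([
    rng.normal([-2.0, -2.0], 0.8, size=(150, 2)),
    rng.normal([3.0, 3.0], 1.0, size=(150, 2)),
])

# Fit a two-component GMM; scikit-learn runs EM internally.
gmm = GaussianMixture(n_components=2, random_state=1).fit(data)

# Hard labels: the most probable component for each point.
labels = gmm.predict(data)
# Soft assignments: posterior probabilities over components, one row per point.
responsibilities = gmm.predict_proba(data)

print(responsibilities.sum(axis=1)[:3])  # each row sums to 1.0
```

The soft assignments are what distinguish GMM clustering from hard-assignment methods such as k-means: a point near a cluster boundary gets a split responsibility rather than a single label.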