A comparison of the L2 minimum distance estimator and the EM-algorithm when fitting k-component univariate normal mixtures
Clarke, B.R., Davidson, T. and Hammarstrand, R. (2016) A comparison of the L2 minimum distance estimator and the EM-algorithm when fitting k-component univariate normal mixtures. Statistical Papers, in press.
The method of maximum likelihood via the EM-algorithm has been the accepted method for fitting finite mixtures of normal distributions ever since it was shown to be superior to the method of moments; recent books testify to this. The method of maximum likelihood has, however, been criticised for this problem, chiefly because when the variances of the component distributions are unequal the likelihood is in fact unbounded, and there can be multiple local maxima. Another major criticism is that the maximum likelihood estimator is not robust. Several alternative minimum distance estimators have since been proposed to deal with the first problem. This paper considers one of these estimators, which is not only superior on account of its robustness but can in fact hold an advantage in numerical studies even at the model distribution. Importantly, robust alternatives to the EM-algorithm, ostensibly fitting t distributions when the data are in fact mixtures of normals, are also not competitive at the normal mixture model when compared with the chosen minimum distance estimator. It is argued, for instance, that natural processes should lead to mixtures whose component distributions are normal as a consequence of the Central Limit Theorem; on the other hand, data can be contaminated by extraneous sources, as is typically assumed in robustness studies. This calls for a robust estimator.
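As background, the L2 distance criterion for a normal mixture has a well-known closed form, since the integral of a product of two normal densities is itself a normal density evaluated at the difference of the means. The sketch below (plain Python; the function names are illustrative, not the paper's implementation) shows the criterion that an L2 minimum distance estimator would minimise over the mixture parameters.

```python
import math

def phi(x, mu=0.0, sigma=1.0):
    # Univariate normal density with mean mu and standard deviation sigma.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, weights, mus, sigmas):
    # Density of a k-component normal mixture at the point x.
    return sum(w * phi(x, m, s) for w, m, s in zip(weights, mus, sigmas))

def l2_criterion(data, weights, mus, sigmas):
    # L2 criterion: integral of f_theta^2 minus (2/n) * sum_i f_theta(x_i).
    # The squared-density integral has a closed form for normal mixtures:
    #   int phi(x; mu_j, s_j) phi(x; mu_k, s_k) dx
    #     = phi(mu_j - mu_k; 0, sqrt(s_j^2 + s_k^2)).
    int_f2 = sum(
        wj * wk * phi(mj - mk, 0.0, math.sqrt(sj ** 2 + sk ** 2))
        for wj, mj, sj in zip(weights, mus, sigmas)
        for wk, mk, sk in zip(weights, mus, sigmas)
    )
    cross = (2.0 / len(data)) * sum(
        mixture_pdf(x, weights, mus, sigmas) for x in data
    )
    return int_f2 - cross
```

Because the criterion is a smooth, bounded function of the parameters, it avoids the unbounded-likelihood problem noted above; minimising it (for example by a general-purpose optimiser) yields the minimum distance estimate.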
Publication Type: Journal Article
Murdoch Affiliation: School of Engineering and Information Technology
Copyright: © 2016 Springer-Verlag Berlin Heidelberg