What is it about?
This book explains the asymptotic theory of robust M-estimation and of the many minimum distance estimators whose estimating equations correspond to those of M-estimators. Subsequent applications identify robust estimators through bounded and continuous psi functions, that is, estimators with bounded influence functions. Global and local arguments are explored for estimating equations, explaining consistency and asymptotic normality when the estimating equations admit more than one root, with examples. L-estimators are explored before examining trimmed likelihood estimators. Illustrations of the theory include problems involving mixtures of normal distributions and regression modeling, followed by outlier identification in multivariate analysis.
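To make the idea of a bounded psi function concrete, here is a minimal sketch (not taken from the book) of an M-estimator of location using Huber's psi, which is linear near zero but bounded by a tuning constant k; the bound on psi is what bounds the influence of any single observation. The function names, the tuning constant k = 1.345, and the fixed-point iteration are illustrative choices, not the book's notation.

```python
def huber_psi(u, k=1.345):
    """Huber's psi function: psi(u) = max(-k, min(k, u)).

    Bounded and continuous, so the resulting M-estimator has a
    bounded influence function."""
    return max(-k, min(k, u))

def huber_location(data, k=1.345, tol=1e-8, max_iter=100):
    """Solve the estimating equation sum_i psi((x_i - t)/s) = 0 for t
    by fixed-point iteration, with s a robust (MAD) scale estimate."""
    xs = sorted(data)
    n = len(xs)
    # Median as a robust starting value.
    med = (xs[n // 2] + xs[(n - 1) // 2]) / 2
    # Median absolute deviation, rescaled for consistency at the normal.
    devs = sorted(abs(x - med) for x in xs)
    s = 1.4826 * (devs[n // 2] + devs[(n - 1) // 2]) / 2 or 1.0
    t = med
    for _ in range(max_iter):
        step = s * sum(huber_psi((x - t) / s, k) for x in data) / n
        t += step
        if abs(step) < tol:
            break
    return t

# A single gross outlier barely moves the estimate,
# unlike the sample mean (which psi(u) = u would reproduce).
clean = [1.1, 0.9, 1.0, 1.2, 0.8, 1.05, 0.95]
contaminated = clean + [50.0]
```

Running `huber_location(contaminated)` returns a value close to 1, whereas the sample mean of the same data is pulled above 7 by the outlier; this is the practical content of a bounded influence function.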
Why is it important?
Understanding the asymptotic theory of M-estimators is necessary for understanding how robust estimators should be implemented, and indeed defined. In many practical situations one must analyse data when there is potential for contamination or for incorrect specification of the model. Data can also be subject to transcription errors or incorrect computer entry.
Read the Original
This page is a summary of: Wiley Series in Probability and Statistics, August 2008, Wiley, DOI: 10.1002/9780470377994.scard.