Asymptotic Theory of Outlier Detection Algorithms for Linear Time Series Regression Models

Søren Johansen, Bent Nielsen


Abstract

Outlier detection algorithms are intimately connected with robust statistics that down-weight some observations to zero. We define a number of outlier detection algorithms related to the Huber-skip and least trimmed squares estimators, including the one-step Huber-skip estimator and the forward search. Next, we review a recently developed asymptotic theory of these algorithms. Finally, we analyse the gauge, the fraction of wrongly detected outliers, for a number of outlier detection algorithms and establish asymptotic normal and Poisson theories for the gauge.
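To make the one-step Huber-skip estimator and the gauge concrete, the following is a minimal simulation sketch in Python. It is an illustration under simplifying assumptions, not the paper's algorithms: the paper's consistency-corrected scale estimates, cut-off calibration, and iteration schemes are omitted. With clean Gaussian errors and cut-off c, the simulated gauge should be close to 2(1 - Φ(c)), roughly 0.0124 for c = 2.5.

```python
import numpy as np

def one_step_huber_skip(X, y, cutoff=2.5):
    """One-step Huber-skip sketch: fit OLS, flag observations with large
    absolute residuals, then re-fit OLS on the retained observations.
    (Hypothetical illustration; the paper refines the scale estimate.)"""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)            # initial full-sample OLS
    resid = y - X @ beta0
    sigma = np.sqrt(resid @ resid / (len(y) - X.shape[1]))   # residual std. dev.
    keep = np.abs(resid) <= cutoff * sigma                   # skip flagged "outliers"
    beta1, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta1, ~keep                                      # estimate and outlier flags

# Simulated gauge: average fraction of observations wrongly flagged as
# outliers when the data contain none; asymptotically ~ 2*(1 - Phi(cutoff)).
rng = np.random.default_rng(0)
n, cutoff, reps = 1000, 2.5, 200
rates = []
for _ in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one regressor
    y = X @ np.array([1.0, 1.0]) + rng.normal(size=n)        # clean data, no outliers
    _, flagged = one_step_huber_skip(X, y, cutoff)
    rates.append(flagged.mean())
print(f"simulated gauge: {np.mean(rates):.4f}")              # approx. 0.0124
```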

Original language: English
Journal: Scandinavian Journal of Statistics
Volume: 43
Issue number: 2
Pages (from-to): 321-348
Number of pages: 28
ISSN: 0303-6898
DOIs
Publication status: Published - 1 Jun 2016
