Asymptotic Theory of Outlier Detection Algorithms for Linear Time Series Regression Models

Søren Johansen, Bent Nielsen


    Abstract

    Outlier detection algorithms are intimately connected with robust statistics that down-weight some observations to zero. We define a number of outlier detection algorithms related to the Huber-skip and least trimmed squares estimators, including the one-step Huber-skip estimator and the forward search. Next, we review a recently developed asymptotic theory of these algorithms. Finally, we analyse the gauge, the fraction of wrongly detected outliers, for a number of outlier detection algorithms and establish an asymptotic normal and a Poisson theory for the gauge.
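
    To make the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a one-step Huber-skip estimator and of the empirical gauge on outlier-free data. The cutoff c, the OLS initial estimator, and the simulation setup are illustrative assumptions; the paper's algorithms and asymptotic theory are more general.

    # Illustrative sketch of a one-step Huber-skip estimator and the empirical
    # gauge; cutoff c and simulation design are assumptions, not from the paper.
    import numpy as np

    def one_step_huber_skip(X, y, c=2.576):
        """One step: initial OLS -> flag |residual| > c*sigma -> re-fit OLS."""
        beta0, *_ = np.linalg.lstsq(X, y, rcond=None)      # initial estimator
        resid = y - X @ beta0
        sigma = np.sqrt(resid @ resid / (len(y) - X.shape[1]))
        keep = np.abs(resid) <= c * sigma                   # retained observations
        beta1, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        return beta1, ~keep                                 # estimate, outlier flags

    rng = np.random.default_rng(0)
    n = 10_000
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)       # no true outliers
    _, flagged = one_step_huber_skip(X, y)
    # With no true outliers, every flagged observation is wrongly detected, so
    # the mean flag rate is the empirical gauge, roughly 2*(1 - Phi(c)) = 0.01
    # for c = 2.576 under Gaussian errors.
    print(f"empirical gauge: {flagged.mean():.4f}")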

    Original language: English
    Journal: Scandinavian Journal of Statistics
    Volume: 43
    Issue number: 2
    Pages (from-to): 321-348
    Number of pages: 28
    ISSN: 0303-6898
    DOI
    Status: Published - 1 Jun 2016
