The attention-weighted sample-size model of visual short-term memory: Attention capture predicts resource allocation and memory load

Philip L. Smith*, Simon D. Lilburn, Elaine A. Corbett, David K. Sewell, Søren Kyllingsbæk

*Corresponding author for this work

    Abstract

    We investigated the capacity of visual short-term memory (VSTM) in a phase discrimination task that required judgments about the configural relations between pairs of black and white features. Sewell et al. (2014) previously showed that VSTM capacity in an orientation discrimination task was well described by a sample-size model, which views VSTM as a resource composed of a finite number of noisy stimulus samples. The model predicts the invariance of $\sum_i (d'_i)^2$, the sum of squared sensitivities across items, for displays of different sizes. For phase discrimination, the set-size effect significantly exceeded that predicted by the sample-size model for both simultaneously and sequentially presented stimuli. Instead, the set-size effect and the serial position curves with sequential presentation were predicted by an attention-weighted version of the sample-size model, which assumes that one of the items in the display captures attention and receives a disproportionate share of resources. The choice probabilities and response time distributions from the task were well described by a diffusion decision model in which the drift rates embodied the assumptions of the attention-weighted sample-size model.
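
    To make the sample-size prediction concrete, here is a minimal Python sketch (not from the paper; the pool size K, the scaling constant c, and the 0.6 attention weight are hypothetical values chosen for illustration). It shows why the sum of squared sensitivities, $\sum_i (d'_i)^2$, stays constant across display sizes when a fixed pool of noisy samples is divided among the items, and how an attention-weighted allocation changes individual items' sensitivities without changing that sum.

```python
import numpy as np

# Illustrative sketch of the sample-size model (hypothetical values, not the
# authors' code): VSTM is treated as a pool of K noisy samples. If a display
# of n items shares the pool, item i's sensitivity is d'_i = c * sqrt(K * s_i),
# where s_i is item i's share of the samples. Whenever the shares sum to 1,
# sum_i (d'_i)^2 = c^2 * K, so the sum is invariant across display sizes.

K = 100.0   # assumed total number of VSTM samples (hypothetical)
c = 0.2     # assumed constant linking sample counts to d' (hypothetical)

def sensitivities(n_items, attended_weight=None):
    """Per-item d' for a display of n_items.

    attended_weight: share of the sample pool captured by one attended item
    (None gives the equal allocation of the plain sample-size model)."""
    if attended_weight is None or n_items == 1:
        shares = np.full(n_items, 1.0 / n_items)
    else:
        shares = np.full(n_items, (1.0 - attended_weight) / (n_items - 1))
        shares[0] = attended_weight  # attended item gets a disproportionate share
    return c * np.sqrt(K * shares)

for n in (1, 2, 4):
    equal = sensitivities(n)
    weighted = sensitivities(n, attended_weight=0.6)
    print(f"n={n}: equal d' = {np.round(equal, 3)}, "
          f"sum (d')^2 = {np.sum(equal**2):.2f}; "
          f"attention-weighted d' = {np.round(weighted, 3)}, "
          f"sum = {np.sum(weighted**2):.2f}")
```

    Under these assumptions, equal allocation makes per-item d' fall only as 1/sqrt(n), while the attention-weighted allocation keeps the attended item's d' high and pushes unattended items' d' down more steeply, which is the kind of steeper set-size effect the abstract attributes to attention capture.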
    Original language: English
    Journal: Cognitive Psychology
    Volume: 89
    Pages (from-to): 71-105
    Number of pages: 35
    ISSN: 0010-0285
    DOI
    Status: Published - 1 Sep 2016
