Stochastic sampling provides a unifying perspective on working memory limits

Authors
Dr. Paul Bays
University of Cambridge, Psychology
Abstract

Recent debate regarding the limits of working memory (WM) has focused on whether memory resources are better characterized as discrete or continuous, with models of each type competing to best capture the errors humans make in recall. I will argue that this apparent dichotomy is largely illusory, and that the critical distinction is instead between deterministic and stochastic mechanisms of WM, with only the latter being compatible with observed human performance and the underlying biological system. I will show that reconceptualizing existing models in terms of sampling reveals strong commonalities between supposedly opposing accounts. A probabilistic limit on how many items can be successfully recalled from WM is an emergent property of continuous models, despite these models having no explicit mechanism to enforce such a limit. Furthermore, adding stochasticity to the number of samples in a discrete model puts its ability to describe behaviour on a par with continuous models. Finally, stochastic sampling has a theoretical connection with biologically plausible implementations of WM based on the inherently stochastic activity of neural populations.
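
To make the emergent-limit claim concrete, here is a minimal sketch in Python of a generic stochastic-sampling scheme, not the model presented in the talk: a fixed mean budget of noisy samples is shared across items, each item receives a Poisson-distributed number of samples, and an item that happens to receive zero samples produces a uniform guess. The sample budget, the von Mises concentration, and the function name simulate_recall are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def simulate_recall(n_items, mean_budget=8.0, kappa=10.0, n_trials=20000):
    # Each item independently receives a Poisson number of samples whose
    # mean is the shared budget divided by the number of items in memory.
    k = rng.poisson(mean_budget / n_items, size=n_trials)

    errors = np.empty(n_trials)
    no_samples = (k == 0)
    # An item that receives zero samples cannot be recalled: the response
    # is a uniform guess on the circle (an emergent "item limit").
    errors[no_samples] = rng.uniform(-np.pi, np.pi, size=no_samples.sum())
    # Otherwise the response is the circular mean of k noisy samples,
    # so recall precision increases with the number of samples received.
    for i in np.flatnonzero(~no_samples):
        samples = rng.vonmises(0.0, kappa, size=k[i])
        errors[i] = np.angle(np.mean(np.exp(1j * samples)))
    return errors, no_samples.mean()

for n in (1, 2, 4, 8):
    err, p_guess = simulate_recall(n)
    print(f"{n} items: P(no samples) = {p_guess:.3f}, "
          f"wrapped error SD = {np.degrees(err.std()):.1f} deg")

With this allocation scheme the proportion of guess-like responses rises with set size even though no capacity limit is imposed anywhere in the code, mirroring the probabilistic limit that the abstract describes as emerging from continuous, stochastic models.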


Cite this as:

Bays, P. (2021, July). Stochastic sampling provides a unifying perspective on working memory limits. Paper presented at Virtual MathPsych/ICCM 2021. Via mathpsych.org/presentation/586.