
Fast Likelihood Computation for Attentional Drift Diffusion Models and Beyond

Authors
Dr. Matthew Harrison
Brown University ~ Applied Mathematics
Mr. Sicheng Liu
Brown University ~ Applied Mathematics
Dr. Michael Frank
Brown University ~ Carney Institute for Brain Science; Department of Cognitive, Linguistic & Psychological Sciences
Alexander Fengler
Brown University ~ Cognitive, Linguistic & Psychological Sciences
Abstract

Classical versions of sequential sampling models (SSMs) assume that the rate of evidence accumulation is constant over a given trial. Empirical evidence, however, suggests that moment-by-moment attention, indicated for example by eye-gaze patterns, can shift the rate of accumulation so that it vacillates over the course of single trials. These dynamics are captured by models such as the attentional Drift Diffusion Model (aDDM). However, parameter inference for such models, in a way that faithfully tracks the generative process, remains a challenge. Specifically, the attention process, captured as arbitrary saccades and gaze times, forms a time-point-wise covariate that cannot be reduced to a fixed-dimensional summary statistic, and thus poses a challenge even for likelihood-free methods at the research frontier. We propose a method for fast computation of likelihoods for a class of models that subsumes the aDDM. The method divides each trial into discrete time stages with fixed attention, uses fast analytical methods to assess stage-wise likelihoods, and integrates these to calculate overall trial-wise likelihoods. Operationalizing this method, we characterize parameter recovery in a variety of settings and compare to widely used approximations to the aDDM, which instead use only fixation proportions to maintain tractable likelihoods. We characterize the space of experiments in which such approximations may be appropriate and point out which settings drive the model formulations apart. Our method will be made available to the community as a small Python package that integrates seamlessly into the wider probabilistic programming ecosystem around the PyMC library.
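The stage-wise idea in the abstract — piecewise-constant drift set by the currently fixated item, with stage likelihoods composed into a trial-wise likelihood — can be illustrated with a minimal numerical sketch. Everything below is an illustrative assumption, not the authors' implementation: the authors use fast analytical stage-wise likelihoods, whereas this sketch propagates a discretized density of the decision variable with a Gaussian one-step kernel and absorbing boundaries, and all parameter names (`values`, `theta`, `a`, `z`, `sigma`) are hypothetical.

```python
import numpy as np
from math import erf, sqrt

def _Phi(u):
    # standard normal CDF
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def trial_loglik(fixations, rt, choice, values, theta,
                 a=2.0, z=0.5, sigma=1.0, dt=5e-3, nx=101):
    """Illustrative sketch of stage-wise likelihood propagation for an
    aDDM-style model (NOT the authors' analytical method). The decision
    variable starts at z*a and diffuses with a drift that is constant
    within each fixation stage; mass crossing 0 or a is absorbed.
    Returns the log of the (defective) density of making response
    `choice` (1 = upper boundary, 0 = lower) at time `rt`."""
    xs = np.linspace(0.0, a, nx)[1:-1]        # interior grid points
    p = np.zeros(len(xs))
    p[np.argmin(np.abs(xs - z * a))] = 1.0    # point mass at starting point
    sd = sigma * np.sqrt(dt)                  # one-step diffusion scale
    t = 0.0
    for item, dur in fixations:               # item in {0, 1}: fixated option
        # fixated item's value weighted fully, the other discounted by theta
        v = values[item] - theta * values[1 - item]
        mu = xs + v * dt                      # one-step Euler means
        # per-column absorption probabilities at the two boundaries
        up = 1.0 - np.array([_Phi((a - m) / sd) for m in mu])
        dn = np.array([_Phi((0.0 - m) / sd) for m in mu])
        # discretized Gaussian transition kernel, columns = source points
        K = np.exp(-(xs[:, None] - mu[None, :]) ** 2 / (2.0 * sd ** 2))
        K /= K.sum(axis=0)                    # normalize each column
        K *= (1.0 - up - dn)                  # scale by survival probability
        for _ in range(int(round(dur / dt))):
            flux_up = float(up @ p)           # mass absorbed this step, top
            flux_dn = float(dn @ p)           # mass absorbed, bottom
            p = K @ p
            t += dt
            if t >= rt:
                f = (flux_up if choice == 1 else flux_dn) / dt
                return np.log(max(f, 1e-300))
    return -np.inf                            # rt exceeds provided fixations
```

Integrating over trials would then just sum `trial_loglik` across the data set, and the resulting log-likelihood could in principle be wrapped as a custom distribution for a probabilistic programming library such as PyMC.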

Keywords

Likelihood
Sequential Sampling Models
aDDM
DDM
Numerics

Cite this as:

Harrison, M., Liu, S., Frank, M. J., & Fengler, A. (2024, June). Fast Likelihood Computation for Attentional Drift Diffusion Models and Beyond. Paper presented at Virtual MathPsych/ICCM 2024. Via mathpsych.org/presentation/1420.