Fast Likelihood Computation for Attentional Drift Diffusion Models and Beyond
Classical sequential sampling models (SSMs) assume that the rate of evidence accumulation is constant over a given trial. Empirical evidence, however, suggests that moment-by-moment attention, indicated for example by eye-gaze patterns, can shift the rate of accumulation so that it vacillates over the course of a single trial. These dynamics are captured by models such as the attentional Drift Diffusion Model (aDDM). Parameter inference for such models, in a way that faithfully tracks the generative process, nevertheless remains a challenge. Specifically, the attention process, recorded as arbitrary sequences of saccades and gaze durations, forms a time-point-wise covariate that cannot be reduced to a fixed-dimensional summary statistic, and thus poses a challenge even for likelihood-free methods at the research frontier. We propose a method for fast likelihood computation for a class of models that subsumes the aDDM. The method divides each trial into discrete time stages with fixed attention, uses fast analytical methods to evaluate stage-wise likelihoods, and integrates these to compute overall trial-wise likelihoods. Operationalizing this method, we characterize parameter recovery in a variety of settings and compare against widely used approximations to the aDDM, which instead rely only on fixation proportions to maintain tractable likelihoods. We characterize the space of experiments in which such approximations may be appropriate and point out which settings drive the model formulations apart. Our method will be made available to the community as a small Python package that integrates seamlessly into the wider probabilistic programming ecosystem around the PyMC library.
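As a concrete illustration of the stage-wise idea, the sketch below propagates the evidence distribution of an aDDM across fixation stages on a discretized grid: within each fixation the attention-weighted drift is held constant, probability mass absorbed at the decision bounds is tracked step by step, and the trial likelihood is the absorption flux at the observed response time. The function name, the parameter names (d, theta, sigma, bound), and the numerical grid propagation are assumptions for illustration; the paper's method instead evaluates the stage-wise likelihoods analytically.

```python
import numpy as np
from scipy.stats import norm

def addm_trial_loglik(rt, choice, fixations, v_left, v_right,
                      d=0.002, theta=0.5, sigma=0.1,
                      bound=1.0, dt=0.001, n_grid=201):
    """Approximate log-likelihood of one aDDM trial via stage-wise propagation.

    rt        : response time in seconds
    choice    : "left" (upper bound) or "right" (lower bound)
    fixations : list of (item, duration) pairs, item in {"left", "right"}

    Parameter names follow common aDDM conventions; the grid-based
    propagation is an illustrative numerical stand-in, not the paper's
    analytical stage-wise computation.
    """
    x = np.linspace(-bound, bound, n_grid)      # interior evidence states
    dx = x[1] - x[0]
    p = np.zeros(n_grid)
    p[n_grid // 2] = 1.0 / dx                   # evidence starts at zero

    sd = sigma * np.sqrt(dt)
    t, lik = 0.0, 0.0
    for item, dur in fixations:
        # Attention-weighted drift is constant within one fixation stage.
        drift = d * (v_left - theta * v_right) if item == "left" \
            else d * (theta * v_left - v_right)
        mean = x + drift * dt                   # per-state one-step means

        # Interior-to-interior Gaussian transition kernel for one dt step.
        K = norm.pdf(x[:, None], loc=mean[None, :], scale=sd) * dx
        # Per-state probability of crossing each bound within one step.
        up = norm.sf(bound, loc=mean, scale=sd)
        dn = norm.cdf(-bound, loc=mean, scale=sd)

        stage_end = min(t + dur, rt)
        while t < stage_end - 1e-9:
            flux_up = float(up @ p) * dx        # mass absorbed this step
            flux_dn = float(dn @ p) * dx
            p = K @ p                           # propagate surviving mass
            t += dt
            if t >= rt - 1e-9:                  # response occurs this step
                lik = flux_up if choice == "left" else flux_dn
                return np.log(max(lik / dt, 1e-300))

    # Fixations ended before rt; return a floor value for the sketch.
    return np.log(max(lik, 1e-300))
```

A trial-wise log-likelihood of this form could then be wrapped as a custom log-probability term (for instance, a PyTensor Op) so that PyMC can sample aDDM parameters hierarchically, which appears to be the role envisioned for the planned package.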
Cite this as:
Harrison, M.,