
Cognitive modeling of category learning and reversal learning

Nele Russwinkel
André Brechmann
Leibniz Institute for Neurobiology ~ Combinatorial NeuroImaging
Marcel Lommerzheim
Leibniz Institute for Neurobiology ~ Combinatorial NeuroImaging

During learning, humans often test new hypotheses to infer causal relations between objects and actions. One very common example is category learning, in which humans learn to differentiate between stimuli based on their features. The rational aspects of category learning, in the form of hypothesis testing, need to be taken into consideration to improve computational models. In contrast to reinforcement learning models, which assume gradual learning, cognitive modeling makes it possible to implement hypothesis testing and thus enables steep transitions in learning. Here we extend our previously developed ACT-R model in a systematic way to further improve its fit to an auditory category learning and reversal learning experiment. For the initial category learning phase, we optimized the model by enabling it to use two stimulus features right from the start. To improve the model's performance in the reversal phase, we introduced an additional mechanism that switches the motor response for a given categorization. With these two changes we significantly increased the model's performance in our task. By comparing the backward learning curves of the participants to those of our model, we observed that our model exhibits steep transitions during the initial category learning phase, a feature that reinforcement learning models have difficulty reproducing.
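The steep-transition idea described above can be sketched with a toy win-stay/lose-shift learner. This is a minimal illustration under stated assumptions, not the authors' ACT-R model: the one-feature rule space, the stimulus generator, the midpoint reversal, and the helper names `hypothesis_tester` and `demo` are all invented for this sketch. Because a wrong prediction falsifies the current hypothesis outright, accuracy jumps abruptly once the correct rule is sampled, rather than climbing gradually.

```python
import random

def hypothesis_tester(stimuli, labels, seed=0):
    """Win-stay/lose-shift learner over simple one-feature rules.

    Illustrative sketch only (an assumption, not the ACT-R model).
    Each candidate rule predicts the category from one binary feature,
    either directly or inverted. A correct trial keeps the current
    hypothesis; an error discards it and samples another one.
    """
    rng = random.Random(seed)
    # rule (i, flip): respond with feature i, inverted if flip is True
    rules = [(i, flip) for i in (0, 1) for flip in (False, True)]
    current = rng.choice(rules)
    correct = []
    for s, y in zip(stimuli, labels):
        i, flip = current
        response = (1 - s[i]) if flip else s[i]
        hit = (response == y)
        correct.append(hit)
        if not hit:
            # lose-shift: abandon the falsified hypothesis, try another
            current = rng.choice([r for r in rules if r != current])
    return correct

def demo(n_trials=1000, seed=0):
    """Initial category learning followed by a reversal at the midpoint,
    where the category mapping flips (mimicking a reversal phase)."""
    rng = random.Random(seed)
    stimuli = [(rng.randint(0, 1), rng.randint(0, 1)) for _ in range(n_trials)]
    # category = feature 0 in phase 1, inverted after the reversal
    labels = [s[0] if t < n_trials // 2 else 1 - s[0]
              for t, s in enumerate(stimuli)]
    return hypothesis_tester(stimuli, labels, seed=seed + 1)
```

A gradual learner (e.g. a delta-rule model updating feature weights on the same trials) would instead produce a smoothly rising accuracy curve, which is the contrast the backward-learning-curve comparison in the abstract is drawing.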



ACT-R; Category Learning; Reversal Learning; Cognitive Modeling


Cite this as:

Russwinkel, N., Brechmann, A., & Lommerzheim, M. (2023, July). Cognitive modeling of category learning and reversal learning. Paper presented at MathPsych/ICCM/EMPG 2023.