
Investigating memory reactivation in neural networks: measure and compare exact and generative replay

Authors
Dr. Yihe Lu
University of Edinburgh, School of Informatics
Abstract

Memory reactivation can be observed during sleep or wakefulness in human and rodent brains, and is believed to be crucial for memory consolidation (Lewis and Bendor, 2019). A similar strategy, namely rehearsal or replay, has proven effective in mitigating, or even overcoming, the catastrophic forgetting problem in neural network (NN) modelling and applications (Robins, 1995; Kumaran and McClelland, 2012). Generative replay (GR) (van de Ven, Siegelmann and Tolias, 2020) and experience replay (ER) (Káli and Dayan, 2004) are the two common replay strategies. While GR produces replay samples from random activations in a generative NN, ER revisits exact copies of past training samples preserved in memory storage. Although ER (without memory limits) yields better results and is therefore deployed more often in applications (e.g., machine learning), GR is computationally more efficient and biologically more plausible. In this study we chose restricted Boltzmann machines (RBMs) as our primary NN model. In addition to ER and GR, we consider a new strategy, cued generative replay (cGR), which uses replay cues that are partially correct activations rather than the completely random activations used in standard GR. We propose two indices, evenness and exactness, to measure the quality of replay samples. GR, in contrast to ER, yielded more balanced but less accurate replay (high evenness, low exactness), but the performance of both was largely dependent on the replay amount. We found that cGR could outperform both by improving replay quality.
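The abstract describes three replay strategies (ER, GR, cGR) and two replay-quality indices (evenness and exactness) but gives no implementation details. The sketch below is a minimal illustration, not the author's code: the small Bernoulli RBM, the way each strategy initialises its Gibbs chains, and the particular definitions of evenness (normalised entropy over replayed classes) and exactness (nearest-neighbour similarity to stored originals) are all assumptions made for illustration only.

```python
# Minimal sketch of ER, GR and cGR with a Bernoulli RBM (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli restricted Boltzmann machine."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.c)
        return (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.b)
        return (rng.random(p.shape) < p).astype(float)

    def gibbs(self, v0, k=20):
        """Run k steps of block Gibbs sampling starting from v0."""
        v = v0
        for _ in range(k):
            h = self.sample_h(v)
            v = self.sample_v(h)
        return v

def experience_replay(stored_samples, n):
    """ER: replay exact copies of stored past training samples."""
    idx = rng.integers(0, len(stored_samples), size=n)
    return stored_samples[idx]

def generative_replay(rbm, n, n_visible):
    """GR: start Gibbs chains from completely random visible activations."""
    v0 = (rng.random((n, n_visible)) < 0.5).astype(float)
    return rbm.gibbs(v0)

def cued_generative_replay(rbm, cues, noise=0.3):
    """cGR: start Gibbs chains from partially correct activations
    (here: stored samples with a fraction of units randomly flipped)."""
    flip = (rng.random(cues.shape) < noise).astype(float)
    v0 = np.abs(cues - flip)
    return rbm.gibbs(v0)

def evenness(labels, n_classes):
    """Assumed evenness index: normalised entropy of the class distribution
    of replay samples (1 = perfectly balanced across classes)."""
    counts = np.bincount(labels, minlength=n_classes).astype(float)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(n_classes)

def exactness(replay, originals):
    """Assumed exactness index: mean Hamming similarity of each replay sample
    to its nearest stored original (1 = exact copies)."""
    sim = 1.0 - np.abs(replay[:, None, :] - originals[None, :, :]).mean(axis=2)
    return sim.max(axis=1).mean()
```

Under these illustrative definitions, ER returns exact copies (exactness close to 1), GR samples from free-running chains (the abstract reports higher evenness but lower exactness for GR), and cGR sits in between by biasing the chains with partially correct cues.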

Keywords

restricted Boltzmann machine
memory replay
continual learning

Cite this as:

Lu, Y. (2021, July). Investigating memory reactivation in neural networks: measure and compare exact and generative replay. Paper presented at Virtual MathPsych/ICCM 2021. Via mathpsych.org/presentation/488.