AFEC: Active Forgetting of Negative Transfer in Continual Learning

Liyuan Wang (School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University; Tsinghua-Peking Center for Life Sciences; Dept. of Comp. Sci. & Tech., Institute for AI, BNRist Center, THBI Lab, Tsinghua University); Mingtian Zhang (AI Center, University College London); Zhongfan Jia (IIIS, Tsinghua University); Qian Li (School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University; Tsinghua-Peking Center for Life Sciences); Chenglong Bao (Yau Mathematical Sciences Center, Tsinghua University); Kaisheng Ma (IIIS, Tsinghua University); Jun Zhu (Dept. of Comp. Sci. & Tech., Institute for AI, BNRist Center, THBI Lab, Tsinghua University); Yi Zhong (School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University; Tsinghua-Peking Center for Life Sciences)

mathscidoc:2206.43013

NeurIPS, 2021
Continual learning aims to learn a sequence of tasks from dynamic data distributions. Without access to the old training samples, the knowledge transfer from the old tasks to each new task is difficult to determine, and it might be either positive or negative. If the old knowledge interferes with the learning of a new task, i.e., the forward knowledge transfer is negative, then precisely remembering the old tasks will further aggravate the interference and thus decrease the performance of continual learning. By contrast, biological neural networks can actively forget the old knowledge that conflicts with the learning of a new experience, through regulating the learning-triggered synaptic expansion and synaptic convergence. Inspired by biological active forgetting, we propose to actively forget the old knowledge that limits the learning of new tasks to benefit continual learning. Under the framework of Bayesian continual learning, we develop a novel approach named Active Forgetting with synaptic Expansion-Convergence (AFEC). Our method dynamically expands parameters to learn each new task and then selectively combines them, which is formally consistent with the underlying mechanism of biological active forgetting. We extensively evaluate AFEC on a variety of continual learning benchmarks, including CIFAR-10 regression tasks, visual classification tasks and Atari reinforcement tasks, where AFEC effectively improves the learning of new tasks and achieves state-of-the-art performance in a plug-and-play way.
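The abstract describes the mechanism only at a high level. As a rough illustration, a minimal sketch of the idea is given below, assuming an EWC-style quadratic-penalty formulation of Bayesian continual learning: alongside a penalty that pulls parameters toward the consolidated old-task solution, an "active forgetting" penalty pulls them toward expanded parameters fit to the new task alone, so old knowledge that conflicts with the new task can be selectively relaxed. All names here (afec_style_loss, fisher_old, theta_new_expanded, lambda_old, lambda_forget) are illustrative assumptions, not the paper's actual implementation or API.

import torch

def afec_style_loss(new_task_loss, params, theta_old, fisher_old,
                    theta_new_expanded, fisher_new, lambda_old, lambda_forget):
    """Combine the new-task loss with two quadratic penalties:
    one toward the consolidated old-task parameters (remembering),
    one toward expanded parameters trained on the new task only
    (actively forgetting conflicting old knowledge)."""
    reg_old = 0.0
    reg_forget = 0.0
    for p, p_old, f_old, p_new, f_new in zip(
            params, theta_old, fisher_old, theta_new_expanded, fisher_new):
        # importance-weighted pull toward the old-task solution
        reg_old = reg_old + (f_old * (p - p_old).pow(2)).sum()
        # importance-weighted pull toward the expanded new-task solution
        reg_forget = reg_forget + (f_new * (p - p_new).pow(2)).sum()
    return new_task_loss + 0.5 * lambda_old * reg_old + 0.5 * lambda_forget * reg_forget

The relative weights lambda_old and lambda_forget would then trade off remembering the old tasks against forgetting the parts of them that hinder the new task; this sketch is only meant to make the expansion-then-selective-combination idea concrete.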
@inproceedings{liyuan2021afec,
  title={AFEC: Active Forgetting of Negative Transfer in Continual Learning},
  author={Liyuan Wang and Mingtian Zhang and Zhongfan Jia and Qian Li and Chenglong Bao and Kaisheng Ma and Jun Zhu and Yi Zhong},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20220615220256922947369},
  booktitle={NeurIPS},
  year={2021},
}
Liyuan Wang, Mingtian Zhang, Zhongfan Jia, Qian Li, Chenglong Bao, Kaisheng Ma, Jun Zhu, and Yi Zhong. AFEC: Active Forgetting of Negative Transfer in Continual Learning. In NeurIPS, 2021. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20220615220256922947369.