To handle several different tasks with a single model, various approaches have been proposed under the name of continual learning. In the real world, however, when a new type of task or data emerges, collecting enough data for the model to learn takes considerable time; as a result, sufficient training data is often unavailable when a new task must be learned. In this study, we propose rigid Elastic Weight Consolidation (rEWC), an improvement over Elastic Weight Consolidation (EWC), which prevents catastrophic forgetting through weight regularization, to overcome this problem. The proposed method achieves better prediction performance on newly added tasks while exhibiting less catastrophic forgetting than the original EWC. Furthermore, in a setting where the amount of available training data decreases as the task index increases, it substantially outperforms other widely known continual learning models, including EWC.
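For context, the weight-regularization penalty of the original EWC, which rEWC builds on, is commonly written as below; here $\mathcal{L}_B(\theta)$ denotes the loss on the new task, $\theta^*_{A,i}$ the parameters learned on the previous task, $F_i$ the diagonal Fisher information estimating each parameter's importance, and $\lambda$ the regularization strength. This is only the standard EWC objective for reference; the specific rEWC modification is not reproduced here.

$$
\mathcal{L}(\theta) \;=\; \mathcal{L}_B(\theta) \;+\; \sum_i \frac{\lambda}{2}\, F_i \left(\theta_i - \theta^*_{A,i}\right)^2
$$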
This work was supported by the Korea Institute of Science and Technology Information (KISTI) (Grant Number: K24L4M2C6). This work was supported by the Government-Wide Research and Development Fund Project for Infectious Disease Research (GFID), Republic of Korea (Grant Number: HG23C1624). This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A2C2003474). This work was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) under the Artificial Intelligence Convergence Innovation Human Resources Development (IITP-2024-No.RS-2023-00255968) grant funded by the Korea government (MSIT). This work was supported by the Ajou University research fund.