Continual Learning with Pre-Trained Models

Preview

Nowadays, real-world applications often face streaming data, which requires the learning system to absorb new knowledge as data evolves. Continual Learning (CL) aims to achieve this goal while overcoming catastrophic forgetting of former knowledge when learning new tasks. Typical CL methods build the model from scratch so that it grows with the incoming data. However, the advent of the pre-trained model (PTM) era has sparked immense research interest, particularly in leveraging PTMs' robust representational capabilities for CL. This paper presents a comprehensive survey of the latest advancements in PTM-based CL.
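To make the idea concrete, below is a minimal sketch of one common PTM-based CL baseline: keep a pre-trained backbone frozen and only maintain per-class feature prototypes as new tasks arrive, so old classes are never overwritten by gradient updates. This is an illustrative example, not the specific method proposed in the survey; the `extract_features` function here is a hypothetical stand-in for a real frozen backbone (e.g., a ViT).

```python
import numpy as np

def extract_features(x: np.ndarray) -> np.ndarray:
    """Hypothetical frozen pre-trained backbone.

    In practice this would be a real PTM whose weights are never updated
    during continual learning; here it is a fixed random projection so the
    sketch stays self-contained and runnable.
    """
    rng = np.random.default_rng(0)
    W = rng.standard_normal((x.shape[1], 64))
    return x @ W

class PrototypeContinualLearner:
    """Class-incremental learner storing one mean-feature prototype per class.

    Only prototypes are added or updated; the backbone stays frozen, so
    representations of earlier classes are not disturbed by later tasks.
    """
    def __init__(self):
        self.prototypes = {}  # class id -> mean feature vector

    def learn_task(self, x: np.ndarray, y: np.ndarray) -> None:
        feats = extract_features(x)
        for c in np.unique(y):
            self.prototypes[int(c)] = feats[y == c].mean(axis=0)

    def predict(self, x: np.ndarray) -> np.ndarray:
        feats = extract_features(x)
        classes = sorted(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in classes])
        # Nearest-prototype classification over all classes seen so far.
        dists = np.linalg.norm(feats[:, None, :] - protos[None, :, :], axis=-1)
        return np.array(classes)[dists.argmin(axis=1)]

# Task 1 introduces classes 0-1; task 2 later adds classes 2-3.
learner = PrototypeContinualLearner()
rng = np.random.default_rng(1)
x1, y1 = rng.standard_normal((40, 16)), np.repeat([0, 1], 20)
learner.learn_task(x1, y1)
x2, y2 = rng.standard_normal((40, 16)), np.repeat([2, 3], 20)
learner.learn_task(x2, y2)
print(learner.predict(x1[:5]))  # predictions can still include the old classes
```

The design choice this sketch highlights is the one the survey revolves around: because the PTM already provides strong generic features, continual learning can often be reduced to lightweight updates (prototypes, prompts, or adapters) on top of a frozen backbone rather than training the whole model from scratch.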

Read the full paper here:
Continual Learning with Pre-Trained Models
