Undergraduate thesis project focusing on Parameter Isolation methods to achieve Continual Learning on DNNs
Focuses on parameter isolation methods for continual learning, where each task is assigned its own parameter masks or subnetworks so that learning a new task cannot overwrite (catastrophically forget) earlier ones. Implements Hard Attention to the Task (HAT), Supermask Superposition (SupSup), and Piggyback, along with visualization tools and metrics for task overlap and capacity usage.
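To illustrate the core idea, here is a minimal, Piggyback-style sketch of mask-based parameter isolation: a frozen shared weight matrix plus one learned binary mask per task, trained with a straight-through estimator. This is an assumption-laden illustration (the class name `MaskedLinear`, the threshold, and the initialization are hypothetical), not the thesis code's actual API.

```python
# Minimal sketch of mask-based parameter isolation (Piggyback-style).
# Assumptions: PyTorch backbone, frozen shared weights, one real-valued
# score tensor per task that is binarized into a mask at forward time.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedLinear(nn.Module):
    """Linear layer with frozen shared weights and a learned per-task mask.

    Each task selects its own subnetwork of the shared weights, so
    training task B never modifies the parameters task A relies on.
    """

    def __init__(self, in_features, out_features, num_tasks, threshold=0.0):
        super().__init__()
        # Shared, frozen backbone weights (never updated after init).
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features) * 0.02, requires_grad=False
        )
        # One real-valued score tensor per task; initialized slightly
        # positive so every task starts with an all-ones mask.
        self.scores = nn.Parameter(
            torch.full((num_tasks, out_features, in_features), 0.01)
        )
        self.threshold = threshold

    def forward(self, x, task_id):
        scores = self.scores[task_id]
        hard = (scores > self.threshold).float()
        # Straight-through estimator: the forward pass uses the binary
        # mask, while gradients flow through the continuous scores.
        mask = hard + scores - scores.detach()
        return F.linear(x, self.weight * mask)


layer = MaskedLinear(8, 4, num_tasks=3)
out = layer(torch.randn(2, 8), task_id=0)  # task 0 uses only its own mask
out.sum().backward()                       # gradients reach scores[0] only
print(out.shape)                           # torch.Size([2, 4])
```

HAT and SupSup differ in the details (HAT learns soft per-layer attention vectors that harden over training; SupSup keeps the backbone random and finds one supermask per task), but all three share this pattern of routing each task through its own subset of parameters.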