Forget catastrophic forgetting: sparse memory finetuning, enhanced with a KL-divergence-based update rule, lets LLMs learn continuously without trashing old knowledge.