Parameter importance isn't forever: dynamically adapting which parameters are frozen during fine-tuning significantly improves generalization and reduces forgetting in LLMs.
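The claim above can be illustrated with a minimal toy sketch. The blurb does not specify how importance is measured or how the frozen set is updated, so everything below is an assumption for illustration: importance is tracked as an exponential moving average of gradient magnitude (including for frozen parameters, so stale freezes can be revisited), and the least-important half is re-frozen at every step. The parameter names (`w0`–`w3`), the two-phase toy gradient, and all hyperparameters are hypothetical.

```python
def toy_grads(params, step, total_steps):
    # Hypothetical toy setup: in the first half of training only w0/w1
    # receive gradient signal; in the second half only w2/w3 do. This
    # mimics parameter importance shifting as fine-tuning progresses.
    active = {"w0", "w1"} if step < total_steps // 2 else {"w2", "w3"}
    return {n: (2.0 * p + 1.0) if n in active else 0.0 for n, p in params.items()}

def finetune(total_steps=20, freeze_frac=0.5, beta=0.9, lr=0.1):
    params = {f"w{i}": 0.0 for i in range(4)}
    importance = {n: 0.0 for n in params}
    frozen_history = []
    for step in range(total_steps):
        grads = toy_grads(params, step, total_steps)
        # Track importance as an EMA of gradient magnitude, updated even
        # for currently frozen parameters, so the ranking can change.
        for n in params:
            importance[n] = beta * importance[n] + (1 - beta) * abs(grads[n])
        # Re-rank every step and freeze the least-important fraction,
        # instead of committing to one frozen set for all of fine-tuning.
        ranked = sorted(params, key=lambda n: importance[n])
        frozen = set(ranked[: int(len(params) * freeze_frac)])
        frozen_history.append(frozen)
        for n in params:
            if n not in frozen:
                params[n] -= lr * grads[n]
    return frozen_history

history = finetune()
```

In this toy run the frozen set starts as `{w2, w3}` and flips to `{w0, w1}` once the gradient signal shifts, which is the point of the blurb: a freeze decision made early in fine-tuning can become wrong later, so the frozen set should be revisited as training proceeds.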