Forget gradient descent: this new method routes transformer activations through a Hopfield-inspired memory in a single forward pass to achieve state-of-the-art online continual learning.
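The teaser doesn't give the paper's actual mechanism, so the following is only a minimal sketch of the general idea it gestures at: a modern-Hopfield-style associative memory where "writing" is a single append of activation vectors and "reading" is one softmax-attention step, with no gradient updates anywhere. The `HopfieldMemory` class, its `beta` parameter, and the toy usage are illustrative assumptions, not the paper's API.

```python
import numpy as np

class HopfieldMemory:
    """Toy modern-Hopfield associative memory (illustrative, not the paper's).

    Writing is a single append of activation vectors; reading is one
    softmax-attention step over the stored patterns, so the whole
    store-and-recall cycle is gradient-free and fits in a forward pass.
    """

    def __init__(self, dim, beta=8.0):
        self.beta = beta                    # inverse temperature: higher = sharper recall
        self.patterns = np.empty((0, dim))  # stored activation patterns, one per row

    def write(self, activations):
        # "Learning" a new task is just storing its activations.
        self.patterns = np.vstack([self.patterns, activations])

    def read(self, query):
        # One retrieval step: softmax(beta * X @ q)-weighted sum of patterns.
        scores = self.beta * self.patterns @ query
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights @ self.patterns

# Hypothetical usage: stream activations from two tasks, then recall.
rng = np.random.default_rng(0)
mem = HopfieldMemory(dim=16)
mem.write(rng.normal(size=(5, 16)))   # task-1 activations
mem.write(rng.normal(size=(3, 16)))   # task-2 activations: no replay, no SGD
recalled = mem.read(rng.normal(size=16))
print(recalled.shape)                 # (16,)
```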
Forget replaying old data: this continual learning method dreams up entirely new classes to train on, boosting forward transfer and outperforming standard rehearsal techniques.
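The generator itself isn't described in the teaser; as a hedged illustration, here is one way "dreaming up new classes" could look: synthesizing embeddings for never-seen pseudo-classes by mixing existing class prototypes. The function `synthesize_classes`, the mixup-style interpolation, and the `noise` parameter are assumptions for the sketch, not the paper's method.

```python
import numpy as np

def synthesize_classes(prototypes, n_new, noise=0.1, rng=None):
    """Invent embeddings for brand-new pseudo-classes by mixing pairs of
    existing class prototypes (a stand-in for whatever generator the
    paper actually uses)."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = prototypes.shape
    new = np.empty((n_new, dim))
    for k in range(n_new):
        i, j = rng.choice(n, size=2, replace=False)   # pick two parent classes
        lam = rng.uniform(0.3, 0.7)                   # mixing coefficient
        new[k] = lam * prototypes[i] + (1 - lam) * prototypes[j]
        new[k] += noise * rng.normal(size=dim)        # jitter for diversity
    return new

rng = np.random.default_rng(1)
real_protos = rng.normal(size=(4, 16))                # 4 classes seen so far
dreamed = synthesize_classes(real_protos, n_new=6, rng=rng)
print(dreamed.shape)                                  # (6, 16)
# Training on real + dreamed classes exposes the model to structure beyond
# the current task, which is the intuition behind improved forward transfer.
```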