Forget retraining: Ego personalizes VLMs on the fly, using the model's internal attention to extract and leverage visual tokens that represent specific concepts.
Forget catastrophic forgetting: modular memory that blends in-context and in-weight learning offers a practical path to truly continual learning agents.