Search papers, labs, and topics across Lattice.
Give your memoryless VLA policy a brain: TempoFit retrofits temporal context by reusing existing attention keys and values, boosting long-horizon task success without retraining or added latency.
Human eye-tracking data can significantly boost LLM code summarization, improving BLEU-4 scores by over 13% via a lightweight attention module that distills gaze patterns into learned priors.
By tuning step sizes based on the spectral properties of gradients, SpecMuon offers a more stable and faster alternative to Adam and Muon for training physics-informed neural networks.