Forget expensive audio-text data collection: TASU2 lets you dial in the perfect amount of noise for training your speech LLM, all from text.
Forget static attention allocation – Flux Attention dynamically routes layers between full and sparse attention based on context, delivering substantial speedups without sacrificing accuracy in long-context LLMs.