LMMs can slash FLOPs by 89% without sacrificing accuracy, thanks to a frequency-modulated visual restoration technique that preserves crucial visual semantics even with fewer tokens.
LLMs can reason better if you force them to explore *different* ways of being right, rather than just making them more random.