Achieve near-zero FLOPs and faster time-to-first-token by treating cached documents as immutable packets, eliminating the need for KV recomputation in LLMs.
Current image quality metrics struggle to articulate *why* one high-quality image is better than another; this challenge shows MLLMs are closing that gap by producing expert-level explanations.
MLLMs can be tricked into missing 90% of harmful content simply by encoding it in images that humans can easily read.