Sparsity, often viewed as a means for efficiency, actually unlocks deeper, more effective LLMs by taming variance and boosting layer utilization.
DLMs are not truly parallel because their training data is too sequential, but NAP shows how data curation can unlock genuine parallel decoding and boost reasoning performance.
An open-source ecosystem for agentic learning, complete with a trained agent and novel policy optimization, promises to accelerate research by providing a standardized, scalable platform.