DataFlex makes data-centric LLM training dramatically easier, unifying disparate methods for data selection, mixing, and reweighting into a single, efficient, and reproducible framework.
Sub-2-bit LLMs can now achieve state-of-the-art performance thanks to pQuant, which selectively preserves sensitive parameters in a high-precision branch during quantization-aware training.
Current multimodal agents are surprisingly bad at web browsing, achieving only 36% accuracy on a new benchmark designed to test deep multimodal reasoning across web pages.