Diffusion models can be made more efficient and produce better outputs by dynamically allocating compute based on a learned "difficulty" signature, without any retraining.
A 2B-parameter model trained on a new 1.1M-example dataset can now forecast remote sensing scenes better than Gemini-2.5-Flash Image, suggesting that task-specific training data and methods can beat sheer scale.