A billion-scale SAR foundation model narrows the domain generalization gap in SAR imagery by 10% mIoU, thanks to a physics-guided Mixture-of-Experts (MoE) architecture.
Forget billion-scale datasets: EvoTok achieves state-of-the-art image tokenization for both understanding and generation using a residual evolution process trained on just 13M images.
AutoAgent evolves agent cognition and memory at runtime, achieving superior performance in complex, dynamic environments without external retraining.
FineRMoE achieves 6x higher parameter efficiency, 281x lower prefill latency, and 136x higher decoding throughput than strong baselines, a significant leap in MoE performance.
GraphRAG gets a speed boost: HELP achieves up to 28.8x faster retrieval while maintaining competitive accuracy on multi-hop question answering by intelligently mapping logical reasoning paths to relevant text.
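The path-to-text mapping behind that speedup can be sketched in a few lines. The toy below is illustrative only: the function names, the bag-of-words scorer, and the sample corpus are all assumptions rather than the HELP paper's actual method. It shows the general pattern of scoring passages against each hop of a logical reasoning path instead of traversing the whole graph at query time.

```python
from collections import Counter
import math

def tokens(text: str) -> Counter:
    """Lowercase, strip basic punctuation, and count words."""
    return Counter(text.lower().translate(str.maketrans("", "", ".,?!")).split())

def bow_cosine(a: str, b: str) -> float:
    """Toy relevance scorer: cosine similarity over bag-of-words counts."""
    va, vb = tokens(a), tokens(b)
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_along_path(question: str, path: list[str],
                        corpus: list[str], k: int = 1) -> list[str]:
    """For each hop of a logical reasoning path, keep the k passages that best
    match the question plus that hop, instead of walking the full knowledge
    graph at query time. (Hypothetical sketch, not the HELP algorithm.)"""
    selected = []
    for hop in path:
        query = f"{question} {hop}"
        ranked = sorted(corpus, key=lambda passage: bow_cosine(query, passage),
                        reverse=True)
        selected.extend(ranked[:k])
    return selected

corpus = [
    "Marie Curie was born in Warsaw, Poland.",
    "Marie Curie won the Nobel Prize in Physics in 1903.",
    "Warsaw is the capital of Poland.",
]
# Hops of a 2-hop question, assumed to come from an upstream path planner.
path = ["won the Nobel Prize in Physics", "born in"]
print(retrieve_along_path("Where was the 1903 physics laureate born?", path, corpus))
```

Each hop costs one ranking pass over the passage pool, so retrieval stays linear in the path length rather than growing with graph size, which is the intuition behind the reported speedup.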