Industrial code generation gets a reasoning boost: InCoder-32B-Thinking leverages error-driven feedback and a code world model to achieve top-tier performance on complex hardware-aware tasks.
Achieve better compression in low-bit quantization by considering not just numerical sensitivity, but also the structural role of each layer.
A new 32B code LLM trained specifically for industrial tasks substantially outperforms existing models in specialized domains like chip design and GPU kernel optimization, while remaining competitive on general coding benchmarks.
Code LLMs can achieve SOTA performance in agentic tasks by explicitly modeling the dynamic evolution of software logic across different training stages.
Self-supervision unlocks robust LiDAR global localization by learning scene-specific landmarks from bird's-eye-view (BEV) images, outperforming scene-agnostic methods.
Untangling the mess of "streaming LLMs," this paper delivers a clear taxonomy that distinguishes between streaming generation, streaming inputs, and interactive architectures.
A 32B model trained entirely on synthetic data from InfTool outperforms models 10x larger on tool use, rivaling even Claude-Opus.