Forget scaling depth and width—MOUE unlocks a new "virtual width" dimension for Mixture-of-Experts by cleverly reusing a single expert pool across layers.
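The idea, as the teaser describes it: instead of giving each layer its own experts, every layer routes tokens into one shared pool, so parameter count stays roughly flat while the effective ("virtual") width grows. A minimal PyTorch sketch of that general idea, assuming per-layer top-k routers over a single shared expert pool; the class names, dimensions, and routing details are illustrative assumptions, not MOUE's actual design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedExpertPool(nn.Module):
    """One global pool of expert MLPs, reused by every layer (illustrative)."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int):
        super().__init__()
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        ])


class SharedMoELayer(nn.Module):
    """An MoE block that owns only a router; the experts come from the shared pool."""

    def __init__(self, pool: SharedExpertPool, d_model: int, top_k: int = 2):
        super().__init__()
        self.pool = pool  # shared across layers, not owned by this layer
        self.router = nn.Linear(d_model, len(pool.experts))  # layer-specific
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token is routed to its top-k experts.
        logits = self.router(x)                        # (tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # (tokens, top_k)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.pool.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out


# Several layers share one pool, so adding a layer adds only router weights,
# not a fresh set of experts.
pool = SharedExpertPool(d_model=64, d_hidden=256, num_experts=8)
layers = nn.ModuleList([SharedMoELayer(pool, d_model=64) for _ in range(4)])

x = torch.randn(10, 64)
for layer in layers:
    x = x + layer(x)  # residual connection around each MoE block
```

Because each layer learns its own router over the same experts, different layers can combine the pool in different ways, which is one plausible reading of the "virtual width" framing.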
LLMs can go beyond vulnerability verification to proactively discover zero-day exploits, achieving a 41.8% improvement in detection rate while reducing false positives by 28.3%.
A 1.7B-parameter model can now rival much larger audio language models, thanks to a novel architecture and data synthesis pipeline.