Attention Sink, the tendency of Transformers to fixate on seemingly irrelevant tokens, is more than a quirk: it is a fundamental challenge that affects training and inference, even contributing to hallucinations, and it demands a systematic approach to understanding and mitigation.
Leaderboard-topping video models are still surprisingly brittle, failing on basic video reasoning tasks unless given the right textual cues.
By tightly coupling reasoning, searching, and generation, Unify-Agent shows that agent-based modeling can substantially improve world-knowledge grounding in image synthesis, rivaling closed-source models.
Forget static, single-turn personalization: PersonaVLM unlocks long-term, evolving user alignment in MLLMs, even surpassing GPT-4o.