Reasoning VLMs suffer from "attention pulses" (sporadic, unfocused attention to images during chain-of-thought reasoning), but this can be mitigated with a simple inference-time attention-gating strategy.
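The summary above does not specify how the gating works; as an illustration only, one generic form of inference-time attention gating is to rescale a decoder attention row so that image tokens keep a minimum share of attention mass. The function name, the `floor` threshold, and the rescaling rule below are all hypothetical, not the paper's actual method.

```python
import numpy as np

def gate_image_attention(attn, image_mask, floor=0.2):
    # Hypothetical sketch: `attn` is one attention row (sums to 1) over all
    # tokens; `image_mask` marks image tokens. If the attention mass on image
    # tokens drops below `floor`, boost it to `floor` and renormalize the
    # text-token mass so the row stays a valid distribution.
    attn = attn.astype(float).copy()
    img_mass = attn[image_mask].sum()
    if img_mass < floor:
        txt_mask = ~image_mask
        attn[image_mask] *= floor / max(img_mass, 1e-8)
        attn[txt_mask] *= (1.0 - floor) / max(attn[txt_mask].sum(), 1e-8)
    return attn

row = np.array([0.05, 0.05, 0.45, 0.45])      # image mass 0.1 < floor
mask = np.array([True, True, False, False])
gated = gate_image_attention(row, mask)        # image mass raised to 0.2
```

In a real model this rescaling would be applied inside selected attention layers at each decoding step, which is beyond the scope of this sketch.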