LVLMs can be subtly backdoored via manipulated images: once a specific visual trigger appears, attackers can inject targeted messages into multi-turn conversations and steer users toward attacker-chosen content.