Large vision-language models (LVLMs) are more vulnerable than you might think: a carefully crafted sequence of alternating text and visual prompts can bypass their safety mechanisms with significantly higher success rates.