Visual inputs can hijack the moral compass of VLMs, causing them to abandon carefully tuned text-based safety protocols and make surprisingly unethical decisions.