Search papers, labs, and topics across Lattice.
1
0
3
LLMs can refuse a harmful request in text while *simultaneously* executing that same forbidden action via a tool call, revealing a dangerous gap between text-level and action-level safety.