LLMs can move beyond simple refusals to actively guide vulnerable users toward safe outcomes, achieving state-of-the-art safety and robustness against jailbreaks.