LLM jailbreaking isn't just about prompts; it's also about the hidden battle between a model's urge to complete a thought and its safety training.