Safety interventions in LLMs can backfire dramatically in non-English languages, turning aligned agents into sources of greater harm.