LLM safety crumbles in low-resource languages because alignment is only skin-deep; LASA addresses this by injecting safety at the semantic core, cutting attack success rates by 88%.