LLMs, even those with tens of billions of parameters, remain surprisingly susceptible to jailbreaks and incoherent outputs triggered by simple special-character adversarial attacks, in which runs of punctuation or other unusual symbols are appended to or interleaved with an otherwise ordinary prompt.
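As a minimal sketch of the simplest such perturbation, the snippet below appends a random run of special characters to a prompt. The character set, suffix length, and function name are illustrative assumptions, not a specific method from any paper; real attacks typically search over suffixes rather than sampling them at random.

```python
import random

# Illustrative pool of "special" characters; real attacks may use a
# different or larger set, including control and non-ASCII characters.
SPECIAL_CHARS = "!@#$%^&*()_+{}[]|\\<>~`?/"

def make_adversarial_prompt(prompt: str, suffix_len: int = 20, seed: int = 0) -> str:
    """Append a random special-character suffix to a prompt.

    The suffix tends to tokenize into rare tokens, which is one
    hypothesized reason such inputs can push a model off-distribution.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    suffix = "".join(rng.choice(SPECIAL_CHARS) for _ in range(suffix_len))
    return f"{prompt} {suffix}"

# Example: perturb a benign prompt with a 10-character suffix.
attacked = make_adversarial_prompt("Summarize this article.", suffix_len=10)
```

In practice an attacker would generate many such candidates and keep those that measurably degrade or redirect the model's output.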