LLMs are strikingly susceptible to generating fake news under jailbreak attacks, especially for English-language and U.S.-related topics, exposing a dangerous imbalance in safety coverage across languages and regions.