Even without adversarial attacks, today's best LLMs fail basic sanity checks on information security and access control, revealing critical risks for real-world deployment.