LLMs often fail to anticipate ecological risks arising from seemingly harmless queries, revealing a critical blind spot in their safety alignment.