Search papers, labs, and topics across Lattice.
Tencent Hunyuan
1
0
3
3
Safety-aligned LLMs are so consistently risk-averse that a single, transferable "poison" document can now block up to 96% of queries across different RAG systems, even without access to the target model.