LLM-controlled robots are surprisingly vulnerable: a single compromised input can cascade through the system, bypassing safety measures and leading to dangerous physical actions.