Maliciously crafted tools can hijack LLM agents into "overthinking loops," inflating token costs by over 100x without triggering typical safety filters.