Tongyi Lab
LLM agents can achieve 3x faster web search and higher accuracy by dynamically routing between multiple context management strategies.
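The summary above describes routing between context management strategies rather than fixing one up front. A minimal sketch of that idea follows; the strategy names (`full_context`, `truncate`, `summarize`) and the size-based routing heuristic are illustrative assumptions, not the paper's actual method:

```python
# Sketch of dynamically routing an LLM search agent's conversation
# history to one of several context management strategies.
# All strategy names and the routing heuristic are assumptions.

def full_context(history: list[str]) -> list[str]:
    # Small histories: pass everything through unchanged.
    return history

def truncate(history: list[str], keep: int = 4) -> list[str]:
    # Medium histories: keep only the most recent turns.
    return history[-keep:]

def summarize(history: list[str]) -> list[str]:
    # Large histories: compress older turns into a single placeholder
    # line (a real system would call a summarization model here).
    return [f"[summary of {len(history)} turns]"]

def route(history: list[str], budget: int = 200) -> list[str]:
    """Pick a context strategy based on the size of the history."""
    size = sum(len(turn) for turn in history)
    if size <= budget:
        return full_context(history)
    if len(history) <= 8:
        return truncate(history)
    return summarize(history)

history = [f"turn {i}: " + "x" * 50 for i in range(12)]
print(route(history))  # large history routes to the summary strategy
```

The point of the routing layer is that no single strategy dominates: full context maximizes accuracy on short tasks, while truncation or summarization keeps long searches fast and within the model's context budget.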
Current content moderation benchmarks fail to capture the complexity of real-world scenarios, where posts often violate multiple policies at once and moderation rules evolve constantly, leading to inconsistent AI judgments.