Sun Yat-sen University, Shenzhen Loop Area Institute
LLMs can maintain generation quality in long-context scenarios while using significantly less context, simply by adaptively allocating context based on uncertainty.
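The idea of allocating context based on uncertainty can be sketched as follows. This is a hypothetical illustration, not the paper's actual method: it uses the entropy of a (mocked) next-token distribution as the uncertainty signal, and the `allocate_context` function, the entropy threshold, and the base window size are all assumptions made for the example.

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a next-token probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def allocate_context(full_context, probs, base=4, threshold=1.0):
    """Hypothetical adaptive allocation: when the model is confident
    (low entropy), keep only the most recent `base` tokens; when it is
    uncertain (high entropy), fall back to the full context."""
    if entropy(probs) > threshold:
        return full_context           # uncertain: use the full context
    return full_context[-base:]       # confident: a short window suffices

# Mocked distributions standing in for a real model's next-token probabilities.
tokens = list(range(32))
confident = [0.97, 0.01, 0.01, 0.01]  # peaked: entropy ~0.17 nats
uncertain = [0.25, 0.25, 0.25, 0.25]  # uniform: entropy ~1.39 nats

print(len(allocate_context(tokens, confident)))  # 4
print(len(allocate_context(tokens, uncertain)))  # 32
```

On average, most decoding steps are confident, so the short window dominates and total context use drops while uncertain steps still receive the full context they need.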