On-policy distillation makes language models more accurate, but also dangerously overconfident, revealing a fundamental tension between capability and calibration.