Self-distillation isn't just a trick: this paper shows that it *provably* improves ridge regression performance, even with negative mixing weights in over-regularized regimes, and offers a one-shot tuning method to make it practical.
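
For context, here is a minimal sketch of the setup, assuming the standard self-distillation formulation for ridge regression: a teacher is fit first, then a student is trained on labels that mix the teacher's predictions with the ground truth via a weight ξ (ξ < 0 is the negative-mixing regime the blurb mentions). All names, the choice of ξ, and the toy data below are illustrative, not taken from the paper.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution w = (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def self_distilled_ridge(X, y, lam, xi):
    """One round of self-distillation: fit a teacher, then fit a
    student on a xi-weighted mix of teacher predictions and true labels."""
    w_teacher = ridge_fit(X, y, lam)
    y_teacher = X @ w_teacher                   # teacher pseudo-labels
    y_mixed = xi * y_teacher + (1 - xi) * y     # mixed training targets
    return ridge_fit(X, y_mixed, lam)

# Toy usage: an over-regularized teacher (large lam), where a negative
# mixing weight can partially counteract the excess shrinkage.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.5 * rng.normal(size=200)
w_student = self_distilled_ridge(X, y, lam=50.0, xi=-0.5)
```

In this sketch, ξ = 0 recovers the plain ridge fit; the paper's "one-shot tuning" presumably selects ξ (and possibly the regularization) without iterating the distillation, though the exact procedure is not reproduced here.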