KL divergence can fool you: matching a teacher's predictions doesn't guarantee preserving its internal representations, but distilling with logit distance does.
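One way to see why KL alone underdetermines the teacher's internals: the softmax is invariant to a constant shift of the logits, so two students with very different logits can produce identical output distributions (KL = 0) yet be far apart in logit distance. The sketch below is illustrative, not the paper's method; the shift-by-a-constant construction is an assumption chosen as the simplest case where KL is blind but logit distance is not.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax.
    e = np.exp(z - z.max())
    return e / e.sum()

teacher_logits = np.array([2.0, 1.0, 0.0])
# A hypothetical "student" whose logits are the teacher's shifted by +5:
# same predicted distribution, different underlying logits.
student_logits = teacher_logits + 5.0

p = softmax(teacher_logits)
q = softmax(student_logits)

# KL(p || q) is ~0: the prediction-matching loss cannot tell them apart.
kl = float(np.sum(p * (np.log(p) - np.log(q))))

# A direct logit-distance loss (here, MSE) does see the difference.
logit_mse = float(np.mean((teacher_logits - student_logits) ** 2))

print(f"KL divergence: {kl:.6f}")   # ~0.0
print(f"Logit MSE:     {logit_mse}") # 25.0
```

Because the shift propagates unchanged through the softmax, KL-based distillation accepts this student as perfect, while a logit-distance objective penalizes it; this is the gap the teaser alludes to, in its simplest form.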