KDFlow unlocks 6x faster LLM knowledge distillation by decoupling student training and teacher inference, using hidden state transfer to minimize communication overhead.
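The decoupling idea can be illustrated with a minimal sketch: teacher hidden states are computed once offline and cached, so student training never waits on teacher inference. This is an illustrative toy (NumPy, a linear student fitting cached activations), not KDFlow's actual implementation; all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Phase 1 (offline): run the teacher once and cache its hidden states.
# teacher_hidden is a stand-in for real transformer activations.
def teacher_hidden(tokens):
    return np.tanh(tokens @ rng.standard_normal((8, 16)))

corpus = rng.standard_normal((100, 8))   # 100 toy "token" vectors
cache = teacher_hidden(corpus)           # computed once, reused every epoch

# Phase 2 (online): train the student against the cached states only --
# no teacher in the loop, so the two phases can run on separate hardware.
W = rng.standard_normal((8, 16)) * 0.1   # student projection
lr = 0.05
losses = []
for epoch in range(200):
    pred = corpus @ W                    # student hidden states
    losses.append(float(np.mean((pred - cache) ** 2)))
    grad = 2 * corpus.T @ (pred - cache) / len(corpus)
    W -= lr * grad
```

Because the cache is written once and read many times, teacher and student can scale independently, which is the source of the claimed speedup.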
Token-level alignment via a novel distillation approach lets LLMs train faster and reach higher quality by avoiding the pitfalls of response-level reward optimization.