University of Electronic Science and Technology of China
Diffusion Transformers can be accelerated by up to 7x with nearly lossless performance using a training-free method that selectively computes on sparse anchor tokens, outperforming existing temporal acceleration techniques.
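The summary does not specify how anchor tokens are chosen or reused, so the following is only a minimal sketch of the general idea: run the expensive per-token computation on a sparse subset of anchor tokens and let each remaining token reuse its nearest anchor's output. The function name `anchor_sparse_apply`, the strided anchor selection, and the nearest-anchor reuse rule are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def anchor_sparse_apply(x, stride, f):
    """Sketch of anchor-token sparsity (hypothetical, not the paper's algorithm):
    apply the costly op `f` only to every `stride`-th token, then copy each
    anchor's output to the non-anchor tokens nearest to it."""
    n, _ = x.shape
    anchors = np.arange(0, n, stride)            # sparse anchor indices
    y_anchor = f(x[anchors])                     # expensive compute on anchors only
    # map every token to its nearest anchor and reuse that anchor's result
    nearest = np.clip(np.round(np.arange(n) / stride).astype(int),
                      0, len(anchors) - 1)
    return y_anchor[nearest]

# toy "expensive" op: a linear projection over a 16-token sequence
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
w = rng.standard_normal((8, 8))
out = anchor_sparse_apply(x, stride=4, f=lambda t: t @ w)
print(out.shape)  # (16, 8): full output, but only 4 of 16 tokens were computed
```

With `stride=4` the costly op runs on a quarter of the tokens, which is where the speedup in such schemes comes from; the quality of the approximation then depends entirely on how anchors are selected and how their outputs are propagated.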