Ditch quadratic attention in your ViTs without sacrificing performance: ViT-AdaLA distills knowledge from pre-trained vision foundation models (VFMs) into linear attention architectures, achieving state-of-the-art results on classification and segmentation.
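The post does not spell out ViT-AdaLA's exact formulation, but the core idea it relies on, replacing softmax attention's O(N²) cost with a kernelized linear form, can be sketched generically. The feature map `phi` below (elu(x)+1, as in common linear-attention work) is an illustrative choice, not necessarily the one ViT-AdaLA uses:

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes an (N, N) score matrix -> O(N^2).
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Kernelized linear attention: phi(Q) @ (phi(K)^T V) reorders the
    # computation so cost is O(N d^2), linear in sequence length N.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                        # (d, d_v) summary, independent of N
    Z = Qp @ Kp.sum(axis=0) + eps        # per-query normalizer, shape (N,)
    return (Qp @ KV) / Z[:, None]

N, d = 16, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
out = linear_attention(Q, K, V)
```

In a distillation setup like the one described, a student built from `linear_attention` blocks would be trained to match the token features of a frozen VFM teacher that uses `softmax_attention`.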