By aligning ViT attention with automatically generated, concept-level masks, this fine-tuning method substantially boosts robustness to distribution shifts, outperforming standard regularization techniques.
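The alignment idea in the summary can be sketched as an auxiliary loss that pulls a ViT head's attention distribution toward a concept-level mask. This is a minimal illustrative sketch, not the paper's actual objective: the function name, the choice of KL divergence, and the use of [CLS]-token attention over patch tokens are all assumptions.

```python
import numpy as np

def attention_alignment_loss(attn, mask, eps=1e-8):
    """Hypothetical auxiliary loss: KL divergence between a concept mask
    (normalized into a target distribution over patches) and a ViT head's
    attention weights from the [CLS] token.

    attn: (num_patches,) attention weights, approximately summing to 1.
    mask: (num_patches,) binary concept mask (1 = patch belongs to concept).
    """
    target = mask / (mask.sum() + eps)   # turn the mask into a distribution
    attn = attn / (attn.sum() + eps)     # renormalize attention defensively
    nz = target > 0                      # KL is summed over the mask support
    return float(np.sum(target[nz] * np.log(target[nz] / (attn[nz] + eps))))
```

In training, such a term would be weighted and added to the task loss (e.g. `total = ce_loss + lam * attention_alignment_loss(attn, mask)`), so that attention is regularized toward the concept regions rather than spurious background cues.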