Sorbonne Université, CNRS, ISIR
Forget bigger models: smarter training objectives like pairwise MarginMSE and listwise InfoNCE can boost cross-encoder performance as much as scaling the backbone architecture.
Cross-encoders can be made 4x faster with minimal performance loss by surgically removing interactions, creating a "minimal interaction" architecture (MICE) that rivals late-interaction models in speed and surpasses them in generalization.