LLMs can now perform inference without ever seeing raw text, opening the door to privacy-preserving applications without sacrificing performance.
Multilingual retrievers often prioritize irrelevant English documents over relevant foreign-language documents, even when the query is in that foreign language.
Forget just mining hard negatives: the secret to better knowledge distillation for retrieval lies in matching the *entire* score distribution of your teacher model.
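To make the idea concrete, here is a minimal sketch of distribution-matching distillation for retrieval: instead of a contrastive loss over a positive and a few mined hard negatives, the student minimizes the KL divergence between its score distribution and the teacher's over the full candidate list. This is an illustration of the general technique, not the exact objective or hyperparameters used in the work above.

```python
import math

def softmax(scores, temperature=1.0):
    # Turn raw retrieval scores into a probability distribution over candidates.
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def kl_distillation_loss(teacher_scores, student_scores, temperature=1.0):
    # KL(teacher || student) over the *entire* candidate list, so the student
    # learns the teacher's full score distribution rather than only which
    # document ranks first.
    p = softmax(teacher_scores, temperature)
    q = softmax(student_scores, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy example: one query scored against four candidate documents.
teacher = [4.0, 2.5, 1.0, 0.2]   # teacher's relevance scores
student = [3.0, 3.0, 0.5, 0.1]   # student's current scores
loss = kl_distillation_loss(teacher, student)
```

In training, `loss` would be backpropagated through the student scorer; the temperature controls how much weight the soft tail of the teacher's distribution receives.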