Struggling with hallucination in abstractive summarization? Injecting named entities into the decoder input, along with multimodal embeddings, can help keep a French BART model grounded.
A single speech foundation model can now generate diverse utterance-level representations, capturing both semantics and speaker identity, opening new possibilities for multimodal and multilingual applications.