Quantifying uncertainty in LLMs offers a promising path to detecting and mitigating hallucinations, but current methods still face significant limitations.