Hallucinations in VLMs aren't just errors; they're traceable pathologies in a model's cognitive trajectory, diagnosable via geometric anomalies in a learned cognitive state space.