Diffusion models can now predict driver attention with state-of-the-art accuracy by incorporating LLM-enhanced semantic reasoning.
VecFormer slashes the computational cost of graph transformers while boosting out-of-distribution generalization by computing attention over quantized "graph tokens" rather than individual nodes.
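VecFormer's actual tokenization scheme is not spelled out here, but the general idea — assign nodes to a small codebook of quantized tokens, pool per token, then run attention over the tokens — can be sketched. Everything below (the nearest-centroid assignment, the mean pooling, the codebook size of 8) is an illustrative assumption, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_to_tokens(node_feats, codebook):
    # Assign each node to its nearest codebook vector (its "graph token"),
    # then mean-pool node features into one embedding per occupied token.
    # (Nearest-centroid assignment is an assumption for illustration.)
    dists = ((node_feats[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    assign = dists.argmin(axis=1)
    return np.stack([node_feats[assign == k].mean(axis=0)
                     for k in np.unique(assign)])

def self_attention(tokens):
    # Plain softmax self-attention over the tokens: O(T^2) in the token
    # count T, instead of O(N^2) in the node count N.
    q = k = v = tokens
    scores = q @ k.T / np.sqrt(tokens.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ v

nodes = rng.normal(size=(1000, 16))   # N = 1000 node features
codebook = rng.normal(size=(8, 16))   # 8 token centroids (assumed, random here)
tokens = quantize_to_tokens(nodes, codebook)
out = self_attention(tokens)
print(out.shape)  # at most (8, 16): attention cost depends on tokens, not nodes
```

The payoff is the last comment: the quadratic attention term scales with the handful of tokens, not the thousand nodes, which is where the claimed cost reduction would come from.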