Forget attention's quadratic scaling: the Prototype Transformer offers linear complexity and interpretable, concept-based reasoning while rivaling state-of-the-art performance.
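A minimal sketch of how prototype-based attention can achieve linear complexity, assuming each token attends to a small fixed set of k learned prototype vectors instead of all n other tokens. The function name, prototype count, and shapes below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def prototype_attention(tokens, prototypes):
    """Cross-attention from n tokens to k learned prototypes.

    Cost is O(n * k * d) rather than the O(n^2 * d) of full
    self-attention, so it scales linearly in sequence length n
    when k is a small constant.
    """
    d = tokens.shape[-1]
    # scores: (n, k) -- each token scores every prototype
    scores = tokens @ prototypes.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    # each token's output is a mixture of prototype vectors,
    # which is what makes the computation concept-interpretable
    return weights @ prototypes

rng = np.random.default_rng(0)
n, k, d = 1024, 16, 64  # sequence length, prototypes, model dim
tokens = rng.standard_normal((n, d))
prototypes = rng.standard_normal((k, d))
out = prototype_attention(tokens, prototypes)
print(out.shape)  # (1024, 64)
```

Because k stays fixed as n grows, doubling the sequence length only doubles the work, which is the sense in which such schemes are linear.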