By applying nonlinearity *after* pairwise similarity calculations, Hadamard Linear Attention achieves a higher-degree rational function approximation of softmax, potentially bridging the gap between efficient linear attention and standard quadratic attention.
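The principle behind this ordering can be illustrated with a minimal NumPy sketch. This is not the paper's exact construction — the feature map `phi2` and the choice of f(x) = x² as the post-similarity nonlinearity are illustrative assumptions — but it shows why a polynomial nonlinearity applied *after* the pairwise similarity Q Kᵀ can still be evaluated in linear time: (q·k)² equals an inner product of degree-2 feature maps, so the n×n similarity matrix never needs to be materialized.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
Q = rng.normal(size=(n, d))
K = rng.normal(size=(n, d))
V = rng.normal(size=(n, d))

# Quadratic form: the nonlinearity f(x) = x^2 is applied AFTER the
# pairwise similarities Q K^T, mirroring how softmax applies exp
# post-similarity. Cost is O(n^2 d).
S = (Q @ K.T) ** 2                  # n x n similarity matrix
out_quadratic = S @ V

# Linear-time form: since (q . k)^2 = <q (x) q, k (x) k>, the same
# post-similarity square is absorbed into degree-2 feature maps
# (flattened outer products), avoiding the n x n matrix entirely.
def phi2(X):
    # flatten each row's outer product x x^T into a d^2 feature vector
    return np.einsum('ni,nj->nij', X, X).reshape(X.shape[0], -1)

KV = phi2(K).T @ V                  # d^2 x d summary, built in O(n d^3)
out_linear = phi2(Q) @ KV           # O(n d^3) overall, linear in n

assert np.allclose(out_quadratic, out_linear)
```

Higher-degree polynomial (and hence rational) approximations of softmax follow the same pattern with higher-order feature maps, trading feature dimension for sequence-length cost.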