Steer LLM attention without the memory bottleneck: SEKA unlocks prompt highlighting and other control capabilities even with FlashAttention.
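SEKA's exact mechanism isn't detailed here, but the general idea behind FlashAttention-compatible steering can be sketched: add a bias to the pre-softmax attention scores of highlighted prompt tokens, which composes with fused kernels via their additive-mask path instead of editing a materialized attention matrix. A minimal single-head NumPy sketch (the `steered_attention` helper and `strength` knob are illustrative, not SEKA's API):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def steered_attention(q, k, v, highlight, strength=2.0):
    """Single-head attention with an additive bias on highlighted keys.

    `highlight` is a (Tk,) 0/1 vector marking prompt tokens to emphasize.
    Because the bias enters before the softmax, fused kernels that accept
    an additive mask (e.g. FlashAttention) can apply it without ever
    materializing the full attention matrix.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (Tq, Tk) raw scores
    scores = scores + strength * highlight   # boost highlighted key positions
    weights = softmax(scores, axis=-1)
    return weights @ v, weights

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((n, 8)) for n in (4, 6, 6))
hl = np.zeros(6); hl[2] = 1.0
_, w_biased = steered_attention(q, k, v, hl)
_, w_plain = steered_attention(q, k, v, np.zeros(6))
# Every query now attends more to the highlighted key (position 2).
```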
Diffusion language models get a nearly 20% boost on MBPP by strategically planning the order in which they fill in the blanks.
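The paper's planning strategy isn't specified in this blurb; one common baseline for order planning in masked "diffusion-style" decoding is confidence-ordered unmasking: at each step, fill whichever masked position the model is most certain about. A toy sketch (the `logits_fn` stand-in and greedy rule are assumptions, not the paper's method):

```python
import numpy as np

MASK = -1  # sentinel token id for a still-masked slot

def plan_and_fill(logits_fn, length):
    """Greedily decode a masked sequence out of left-to-right order.

    `logits_fn(tokens) -> (length, vocab)` stands in for the model.
    At every step we pick the masked position with the highest max
    probability and commit its argmax token.
    """
    tokens = np.full(length, MASK)
    while (tokens == MASK).any():
        logits = logits_fn(tokens)
        probs = np.exp(logits - logits.max(-1, keepdims=True))
        probs /= probs.sum(-1, keepdims=True)
        conf = probs.max(-1)
        conf[tokens != MASK] = -np.inf  # only still-masked slots compete
        pos = int(conf.argmax())
        tokens[pos] = int(probs[pos].argmax())
    return tokens
```

With a context-independent `logits_fn` the fill order doesn't change the result, but with a real model each committed token reshapes the predictions for the remaining blanks, which is where order planning pays off.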