University of Edinburgh
Pixel-based language models can now handle multiple languages and scripts, achieving strong multilingual performance and robustness without tokenization.
Hybrid Transformer-SSM architectures can match or exceed pure Transformers in data-efficient in-context retrieval, but only if the task doesn't require precise positional reasoning.