New York University
LLMs struggle with in-context translation of even simple formal languages as grammar size, sentence length, and cross-linguistic differences increase, revealing limits in their ability to make effective use of provided grammatical descriptions.
BabyLM 2026 seeks to push the boundaries of data-efficient and cognitively plausible language modeling, this year with a multilingual twist.