Polish language understanding gets a long-context boost: a new encoder model handles sequences up to 8192 tokens, outperforming existing models on long documents while remaining competitive on shorter texts.