Forget massive multilingual models: LilMoo shows that a carefully trained 0.6B-parameter Hindi model can beat them at their own game.
Open-source Portuguese LLMs just got a major upgrade: Tucano 2 models outperform existing options thanks to a new recipe of curated and synthetic data, plus targeted post-training for RAG and tool use.