Train a competitive 2B-parameter MoE LLM on 16 commodity GPUs connected over the public internet, proving you don't need a datacenter to play the LLM game.