Domain-specific training and aggressive quantization let a 124M-parameter language model outperform larger, general-purpose models on legal retrieval tasks, while running entirely offline on consumer CPUs.
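To make the quantization claim concrete, here is a minimal sketch of symmetric per-tensor int8 weight quantization, one common way to shrink a model for CPU inference. This is an illustration only; the specific scheme the model uses is not stated above.

```python
import numpy as np

def quantize_int8(w):
    # Symmetric per-tensor quantization: map float weights into [-127, 127].
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover an approximation of the original float weights.
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Rounding error per element is at most scale / 2.
assert np.max(np.abs(w - w_hat)) <= s
```

Storing int8 weights instead of float32 cuts memory roughly 4x, which is what makes running a model of this size entirely on a consumer CPU practical.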