Skip the expensive proxy model training: this training-free method boosts VLLM performance by up to 4.8% using only 10-15% of the data, simply by measuring how much the question *changes* the model's view of the answer.
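The core idea, selecting data by how much the question shifts the model's answer distribution, can be sketched as a divergence-based score. This is a minimal illustration, not the paper's actual method: the function names (`influence_score`, `select_top_fraction`) are hypothetical, and KL divergence over answer logits is an assumed choice of metric.

```python
import numpy as np

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def influence_score(logits_with_q, logits_without_q):
    """KL divergence between the model's answer distribution with the
    question in context vs. without it -- a rough proxy for how much
    the question *changes* the model's view of the answer.
    (Hypothetical scoring function for illustration.)"""
    p = softmax(logits_with_q)
    q = softmax(logits_without_q)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def select_top_fraction(scores, fraction=0.15):
    """Keep only the highest-scoring 10-15% of examples, training-free:
    no proxy model is fit, just one scoring pass per example."""
    k = max(1, int(len(scores) * fraction))
    order = np.argsort(np.asarray(scores))[::-1]
    return sorted(order[:k].tolist())

# Toy usage: four candidate examples, each with answer logits
# computed with and without the question in the prompt.
with_q = [np.array([3.0, 0.1, 0.1]), np.array([1.0, 1.0, 1.0]),
          np.array([0.1, 2.5, 0.1]), np.array([1.1, 1.0, 0.9])]
without_q = [np.array([1.0, 1.0, 1.0]), np.array([1.0, 1.0, 1.0]),
             np.array([1.0, 1.0, 1.0]), np.array([1.0, 1.0, 1.0])]
scores = [influence_score(w, wo) for w, wo in zip(with_q, without_q)]
keep = select_top_fraction(scores, fraction=0.5)
```

Examples whose answer distribution barely moves when the question is added (score near zero) are dropped; the rest form the reduced training set.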
Forget scraping private databases: RDB-PFN shows you can pre-train a relational foundation model from scratch using 2 million synthetically generated relational databases and achieve strong few-shot performance.
Generating realistic tabular data with both numbers and free-form text just got easier: TabDLM bridges the gap between diffusion models and LLMs for superior joint modeling.