Forget scraping private databases: RDB-PFN shows you can pre-train a relational foundation model from scratch using 2 million synthetically generated relational databases and achieve strong few-shot performance.
GTokenLLMs suffer from a text-dominant bias, but RGLM offers a way to fix this by reconstructing graph information directly from the LLM's graph token outputs.
Dynamically learning and applying extraction skills significantly improves the quality of extracted domain-specific knowledge hypergraphs, outperforming static few-shot learning.