This paper introduces an entity-centric medical data engineering framework that constructs a Medical Entity Tree (MET) from medical literature to encode hierarchical relationships between diseases, anatomy, modalities, and symptoms. The MET then guides data retrieval, filtering and alignment, and knowledge-aware data synthesis for training MLLMs. Experiments across six medical benchmarks show significant gains in MLLM performance on complex clinical queries, achieving state-of-the-art results.
Fragmented medical data hurts MLLM performance: this paper shows how a hierarchical medical knowledge graph can be used to engineer training data that substantially improves MLLM accuracy on complex clinical tasks.
Multimodal Large Language Models (MLLMs) have shown transformative potential in medical applications, yet their performance is hindered by conventional data curation strategies that rely on coarse-grained partitioning by modality or department. Such fragmented approaches fail to capture the hierarchical and interconnected nature of clinical medical knowledge, limiting the models' ability to perform fine-grained recognition and complex reasoning. In this paper, we propose a novel Entity-Centric Medical Data Engineering framework. We automatically extract entities from authoritative medical literature to construct a Medical Entity Tree (MET), a hierarchical structure that systematically encodes diseases, anatomical structures, modalities, and symptoms into a unified knowledge repository. Building upon the MET, we propose an advanced data engine that includes: (1) node-guided retrieval to anchor raw data to specific medical concepts, (2) a two-stage hybrid filtering and alignment pipeline to ensure precise visual-semantic correspondence, and (3) knowledge-aware data synthesis that leverages the tree's structural constraints to generate enriched captions and targeted reasoning VQA pairs. Extensive evaluations across six medical benchmarks demonstrate that our approach significantly enhances the medical capabilities of general-purpose MLLMs, improving their ability to handle complex clinical queries and achieving state-of-the-art performance in diverse medical contexts.
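To make the framework concrete, here is a minimal sketch of what a Medical Entity Tree node and node-guided retrieval could look like. The paper does not publish this interface; the class `METNode`, its fields, and the function `node_guided_retrieval` are illustrative assumptions, showing only the idea of anchoring raw text data to the most specific matching concept node in the hierarchy.

```python
from dataclasses import dataclass, field

@dataclass
class METNode:
    """One concept node in a Medical Entity Tree (hypothetical structure)."""
    name: str
    category: str                       # e.g. "disease", "anatomy", "modality", "symptom"
    children: list = field(default_factory=list)
    aliases: set = field(default_factory=set)

    def add_child(self, node):
        self.children.append(node)
        return node

    def matches(self, text):
        # Naive surface matching; a real system would use an entity linker.
        text = text.lower()
        return self.name.lower() in text or any(a.lower() in text for a in self.aliases)

def node_guided_retrieval(root, caption):
    """Anchor a raw caption to the most specific matching MET node(s)."""
    hits, stack = [], [root]
    while stack:                        # depth-first scan of the tree
        node = stack.pop()
        if node.matches(caption):
            hits.append(node)
        stack.extend(node.children)
    # Prefer deeper (more specific) concepts: drop any hit whose descendant also matched.
    matched = {id(n) for n in hits}
    def has_matched_descendant(n):
        return any(id(c) in matched or has_matched_descendant(c) for c in n.children)
    return [n for n in hits if not has_matched_descendant(n)]

# Toy tree: thorax (anatomy) -> pneumonia (disease)
root = METNode("thorax", "anatomy", aliases={"chest"})
root.add_child(METNode("pneumonia", "disease"))
print([n.name for n in node_guided_retrieval(root, "Chest X-ray showing pneumonia")])
```

Because both "chest" and "pneumonia" match, the retrieval keeps only the deeper disease node, which is the behavior the paper's node-guided retrieval needs to anchor data at fine granularity rather than at broad anatomical categories.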