This paper introduces HeteroFusion, a novel approach to merge heterogeneous language models built on different architectures like Llama, Qwen, and Mistral. HeteroFusion uses topology-based alignment to transfer knowledge by matching functional module structures and conflict-aware denoising to suppress incompatible signals. Experiments across various settings demonstrate that HeteroFusion outperforms existing merging, fusion, and ensemble baselines.
In short: rather than painstakingly training a single massive model from scratch, HeteroFusion builds a stronger model by intelligently merging the complementary strengths of experts based on Llama, Qwen, and Mistral.
Model merging aims to integrate multiple expert models into a single model that inherits their complementary strengths without incurring the inference-time cost of ensembling. Recent progress has shown that merging can be highly effective when all source models are \emph{homogeneous}, i.e., derived from the same pretrained backbone and therefore share aligned parameter coordinates or compatible task vectors. Yet this assumption is increasingly unrealistic in open model ecosystems, where useful experts are often built on different families such as Llama, Qwen, and Mistral. In such \emph{heterogeneous} settings, direct weight-space fusion becomes ill-posed due to architectural mismatch, latent basis misalignment, and amplified cross-source conflict. We address this problem with \texttt{HeteroFusion}, a method for heterogeneous language model fusion with two key components: topology-based alignment, which transfers knowledge across heterogeneous backbones by matching functional module structures instead of raw tensor coordinates, and conflict-aware denoising, which suppresses incompatible or noisy transfer signals during fusion. We further provide analytical justification showing that preserving the target adapter basis while predicting structured updates leads to a stable and well-conditioned transfer process. Across heterogeneous transfer, multi-source fusion, noisy-source robustness, and cross-family generalization settings, \texttt{HeteroFusion} consistently outperforms strong merging, fusion, and ensemble baselines.
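The abstract does not spell out the denoising rule, but the idea of suppressing incompatible cross-source signals can be illustrated with a generic sign-agreement filter: coordinates where the per-source updates disagree in sign are treated as conflicted and zeroed out, while agreeing coordinates are averaged. This is a minimal sketch of that general principle, not the paper's actual algorithm; the function name, threshold, and data layout are all illustrative assumptions.

```python
# Hedged sketch of conflict-aware fusion: keep only parameter
# coordinates where most source updates agree in sign, and suppress
# (zero out) the rest as incompatible signals. This is a generic
# illustration, not HeteroFusion's published procedure.

def denoise_updates(updates, agreement=0.66):
    """updates: list of equal-length lists of per-parameter deltas,
    one list per source model. Returns a fused delta vector that
    keeps only low-conflict coordinates."""
    n = len(updates)
    dim = len(updates[0])
    fused = []
    for j in range(dim):
        vals = [u[j] for u in updates]
        pos = sum(v > 0 for v in vals)
        neg = sum(v < 0 for v in vals)
        # fraction of sources agreeing with the majority sign
        frac = max(pos, neg) / n
        if frac >= agreement:
            sign = 1 if pos >= neg else -1
            # average only the deltas matching the majority sign
            agreeing = [v for v in vals if v * sign > 0]
            fused.append(sum(agreeing) / len(agreeing))
        else:
            fused.append(0.0)  # conflicted coordinate: suppressed
    return fused
```

For example, with three sources whose second coordinate splits two-against-one in sign, the minority delta is dropped and the majority deltas are averaged, while a fully conflicted coordinate would be zeroed entirely.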