The paper introduces IIBalance, a novel multimodal learning framework that addresses modality imbalance by aligning modality contributions with Intrinsic Information Budgets (IIB). IIBalance estimates each modality's IIB using a task-grounded estimator, converting this capacity into a global prior over modality contributions. It then employs a prototype-based relative alignment mechanism and a probabilistic gating module for calibrated fusion weights, outperforming state-of-the-art balancing methods on three benchmarks.
Instead of forcing modalities to imitate each other, IIBalance lets each modality contribute according to its intrinsic information budget, leading to better multimodal fusion.
Multimodal models often converge to a dominant-modality solution, in which a stronger, faster-converging modality overshadows weaker ones. This modality imbalance leads to suboptimal performance. Existing methods attempt to balance modalities by reweighting gradients or losses; however, they overlook the fact that each modality has a finite information capacity. In this work, we propose IIBalance, a multimodal learning framework that aligns modality contributions with Intrinsic Information Budgets (IIB). We propose a task-grounded estimator of each modality's IIB, transforming this capacity into a global prior over modality contributions. Anchored by the highest-budget modality, we design a prototype-based relative alignment mechanism that corrects semantic drift only when weaker modalities deviate from their budgeted potential, rather than forcing imitation. During inference, we propose a probabilistic gating module that integrates the global budgets with sample-level uncertainty to generate calibrated fusion weights. Experiments on three representative benchmarks demonstrate that IIBalance consistently outperforms state-of-the-art balancing methods and achieves better utilization of complementary modality cues. Our code is available at: https://github.com/XiongZechang/IIBalance.
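To make the gating idea concrete, here is a minimal sketch of how a global budget prior and per-sample uncertainty could be combined into normalized fusion weights. This is a hypothetical softmax-style gate written for illustration; the function name, the temperature `tau`, and the exact combination rule (log-budget minus scaled uncertainty) are assumptions, not the paper's actual module — see the linked repository for the real implementation.

```python
import math

def fusion_weights(budgets, uncertainties, tau=1.0):
    """Hypothetical gate: combine global modality budgets (priors)
    with per-sample uncertainty into normalized fusion weights.

    budgets:       intrinsic information budget per modality (> 0, global)
    uncertainties: per-sample predictive uncertainty per modality (>= 0)
    tau:           temperature scaling how strongly uncertainty
                   discounts a modality on this sample
    """
    # Higher budget raises a modality's logit; higher uncertainty lowers it.
    logits = [math.log(b) - tau * u for b, u in zip(budgets, uncertainties)]
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

# Example: the first modality has the larger global budget, but is
# uncertain on this particular sample, so its weight is discounted.
w = fusion_weights(budgets=[2.0, 1.0], uncertainties=[1.5, 0.2])
```

The weights always sum to one, so they can be used directly to mix per-modality predictions; the temperature controls how far a confident sample can override the global prior.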