This paper introduces FOCAL-Attention, a novel attention mechanism for multi-label node classification on heterogeneous graphs that addresses semantic dilution and coverage constraint issues. FOCAL combines coverage-oriented attention (COA) for flexible context aggregation with anchoring-oriented attention (AOA) to focus on meta-path-induced primary semantics. Experiments demonstrate that FOCAL outperforms state-of-the-art methods by resolving the coverage-anchoring conflict inherent in heterogeneous graph learning.
FOCAL-Attention resolves the inherent coverage-anchoring conflict in heterogeneous graph learning, outperforming existing methods in multi-label node classification.
Heterogeneous graphs have attracted increasing attention for modeling multi-typed entities and relations in complex real-world systems. Multi-label node classification on heterogeneous graphs is challenging due to structural heterogeneity and the need to learn shared representations across multiple labels. Existing methods typically adopt either flexible attention mechanisms or meta-path-constrained anchoring, but in heterogeneous multi-label prediction they often suffer from semantic dilution or coverage constraint. Both issues are further amplified under multi-label supervision. We present a theoretical analysis showing that as heterogeneous neighborhoods expand, the attention mass allocated to task-critical (primary) neighborhoods diminishes, and that meta-path-constrained aggregation exhibits a dilemma: too few meta-paths intensify coverage constraint, while too many re-introduce dilution. To resolve this coverage-anchoring conflict, we propose FOCAL (Fusion Of Coverage and Anchoring Learning), with two components: coverage-oriented attention (COA) for flexible, unconstrained heterogeneous context aggregation, and anchoring-oriented attention (AOA) that restricts aggregation to meta-path-induced primary semantics. Our theoretical analysis and experimental results further indicate that FOCAL outperforms other state-of-the-art methods.
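The two-branch design described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the scaled dot-product scoring, the binary meta-path mask, and the scalar fusion gate are all assumptions standing in for the (unspecified) learned COA, AOA, and fusion modules.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax; masked entries (-inf) get zero weight.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, neighbors, mask=None):
    # Scaled dot-product attention over neighbor features (an assumed
    # scoring function; the paper does not specify one here).
    scores = neighbors @ query / np.sqrt(query.shape[0])
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)
    return softmax(scores) @ neighbors

def focal_sketch(query, neighbors, metapath_mask, gate=0.5):
    # COA: unconstrained aggregation over the full heterogeneous neighborhood.
    coa = attend(query, neighbors)
    # AOA: aggregation restricted to meta-path-induced primary neighbors.
    aoa = attend(query, neighbors, mask=metapath_mask)
    # Fuse both views; the fixed scalar gate is a placeholder for whatever
    # learned fusion FOCAL actually uses.
    return gate * coa + (1 - gate) * aoa
```

Restricting AOA's softmax to the meta-path mask keeps attention mass on primary neighbors even as the full neighborhood grows, which is the dilution effect the theoretical analysis targets, while COA retains access to the unconstrained context.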