The paper introduces UniHetCO, a unified heterogeneous graph representation for unsupervised neural combinatorial optimization (NCO) that encodes problem structure, objective terms, and linear constraints. This enables training a single model across multiple constrained quadratic programming problems with a unified label-free objective, unlike previous methods that are problem-specific. Experiments show UniHetCO achieves competitive performance, strong cross-problem adaptation, and effective warm starts for classical solvers.
Train one NCO model to solve multiple NP-hard graph problems, matching specialized models and even warm-starting classical solvers.
Unsupervised neural combinatorial optimization (NCO) offers an appealing alternative to supervised approaches by training learning-based solvers without ground-truth solutions, directly minimizing instance objectives and constraint violations. Yet for graph node subset-selection problems (e.g., Maximum Clique and Maximum Independent Set), existing unsupervised methods are typically specialized to a single problem class and rely on problem-specific surrogate losses, which hinders learning across classes within a unified framework. In this work, we propose UniHetCO, a unified heterogeneous graph representation for combinatorial optimization problems cast as constrained quadratic programs, encoding problem structure, objective terms, and linear constraints in a single input. This formulation enables training a single model across multiple problem classes with a unified label-free objective. To improve stability under multi-problem learning, we employ a gradient-norm-based dynamic weighting scheme that alleviates gradient imbalance among classes. Experiments on multiple datasets and four constrained problem classes demonstrate competitive performance with state-of-the-art unsupervised NCO baselines, strong cross-problem adaptation potential, and effective warm starts for a commercial classical solver under tight time limits.
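To make the unified label-free objective concrete, the sketch below shows one standard way such a loss can be formed for a relaxed constrained quadratic program: the instance objective plus a penalty on linear-constraint violations. This is a minimal illustration under assumed conventions, not the paper's exact loss; the function name `qp_penalty_loss` and the penalty weight `lam` are hypothetical.

```python
def qp_penalty_loss(p, Q, c, A, b, lam=10.0):
    """Label-free loss for relaxed node-selection variables p in [0,1]:
    quadratic objective p^T Q p + c^T p, plus lam * sum of ReLU(A p - b)
    violations of the linear constraints A p <= b. (Illustrative sketch.)"""
    n = len(p)
    obj = sum(Q[i][j] * p[i] * p[j] for i in range(n) for j in range(n))
    obj += sum(ci * pi for ci, pi in zip(c, p))
    viol = sum(max(sum(a * x for a, x in zip(row, p)) - bi, 0.0)
               for row, bi in zip(A, b))
    return obj + lam * viol

# Toy Maximum Independent Set relaxation on the 3-node path 0-1-2:
# maximize sum(p) (i.e., minimize -sum(p)) subject to p_i + p_j <= 1 per edge.
Q = [[0.0] * 3 for _ in range(3)]
c = [-1.0, -1.0, -1.0]
A = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]  # edges (0,1) and (1,2)
b = [1.0, 1.0]

feasible = [1.0, 0.0, 1.0]    # {0, 2} is an independent set: loss = -2.0
infeasible = [1.0, 1.0, 1.0]  # violates both edge constraints: loss = 17.0
```

Because the loss needs only the instance itself (objective terms and constraints), no ground-truth solutions are required; a single penalized form like this can score relaxed outputs for any problem class expressible as a constrained QP.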