

Poster

OptCM: The Optimization Consistency Models for Solving Combinatorial Problems in Few Shots

Yang Li · Jinpei Guo · Runzhong Wang · Hongyuan Zha · Junchi Yan

Thu 12 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Diffusion models have recently advanced Combinatorial Optimization (CO) as a powerful backbone for neural solvers. However, their iterative sampling process, which requires denoising across multiple noise levels, incurs substantial overhead. We propose to learn direct mappings from different noise levels to the optimal solution of a given instance, enabling high-quality generation in only a few shots. This is achieved through an optimization consistency training protocol, which, for a given instance, minimizes the difference between the optimal solution and samples originating from different generative trajectories and time steps. The proposed Optimization Consistency Models (OptCM) enable fast single-step solution generation while retaining the option of multi-step sampling to trade computation for solution quality, offering a more effective and efficient backbone for neural solvers. In addition, to mitigate the gap between training on historical instances and solving a new instance, we introduce a novel consistency-based gradient search scheme at test time for more effective exploration of the solution space learned during training. It updates the latent solution probabilities under the guidance of objective gradients while alternating noise injection and denoising steps. Extensive experiments on two popular tasks, the Traveling Salesman Problem (TSP) and the Maximal Independent Set (MIS) problem, demonstrate the superiority of OptCM in both solution quality and efficiency, even outperforming LKH under limited time budgets. Notably, OptCM with merely one-step generation and one-step gradient search outperforms, in most cases, SOTA diffusion-based counterparts that require hundreds of steps, while achieving a speedup of tens of times.
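
The optimization consistency objective described above can be pictured with a short sketch. The following is a minimal, hypothetical PyTorch illustration, not the authors' released code: it assumes a discrete (Bernoulli) diffusion over binary solution variables (e.g., edge indicators for TSP or node indicators for MIS), a hypothetical denoiser `model(x_t, t, instance)` that outputs per-variable solution probabilities, and access to a reference optimal solution `x_star`; the function names and the linear noise schedule are illustrative assumptions.

```python
# Sketch of an optimization-consistency training step (assumptions noted above).
import torch
import torch.nn.functional as F

def corrupt(x_star, t, T):
    """Bernoulli forward process: flip each bit with a probability that grows with t (hypothetical linear schedule)."""
    flip_prob = t.float() / T
    flips = torch.bernoulli(flip_prob.expand_as(x_star))
    return (x_star + flips) % 2  # XOR with the sampled flips

def consistency_training_step(model, instance, x_star, T=1000):
    # Sample two different noise levels, i.e., two points on different generative trajectories.
    t1 = torch.randint(1, T + 1, (1,))
    t2 = torch.randint(1, T + 1, (1,))
    x_t1, x_t2 = corrupt(x_star, t1, T), corrupt(x_star, t2, T)

    # Both noisy inputs should map directly to the optimal solution ...
    p1 = model(x_t1, t1, instance)  # predicted per-variable solution probabilities
    p2 = model(x_t2, t2, instance)
    loss_opt = F.binary_cross_entropy(p1, x_star) + F.binary_cross_entropy(p2, x_star)

    # ... and their predictions should agree across time steps and trajectories.
    loss_cons = F.mse_loss(p1, p2)
    return loss_opt + loss_cons
```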
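
Likewise, the test-time consistency-based gradient search alternates noise injection, single-step denoising, and a gradient update on the latent solution probabilities. The sketch below reuses `corrupt` from the previous block and assumes a hypothetical differentiable relaxation `objective(p, instance)` of the CO objective (e.g., expected tour length) returning a scalar, plus a hypothetical `instance.init_probs` tensor; it is one plausible reading of the procedure described in the abstract, not the reference implementation.

```python
import torch  # `corrupt` is defined in the training sketch above

def consistency_gradient_search(model, instance, objective, steps=1, T=1000, lr=0.1):
    # Latent solution probabilities, initialized to 0.5 (instance.init_probs is a hypothetical attribute).
    p = torch.full_like(instance.init_probs, 0.5)
    for _ in range(steps):
        # Noise injection: sample a discrete solution from p and re-corrupt it.
        t = torch.randint(1, T + 1, (1,))
        x_t = corrupt(torch.bernoulli(p), t, T)
        # Denoising: a single consistency step maps the noisy sample back to probabilities.
        p = model(x_t, t, instance)
        # Gradient guidance: nudge the probabilities toward a better objective value.
        p = p.detach().requires_grad_(True)
        obj = objective(p, instance)  # scalar, differentiable relaxation of the CO objective
        obj.backward()
        p = (p - lr * p.grad).clamp(1e-4, 1 - 1e-4).detach()
    return (p > 0.5).float()  # decode a discrete solution
```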
