Poster
in
Workshop: Optimization for ML Workshop

Graph Neural Networks for Hyperparameter Inference in Ising Solvers

Edward Jiang · Timothee Leleu · Sam Reifenstein · Milin Doppalapudi


Abstract:

We propose a novel method for applying graph neural networks (GNNs) to combinatorial optimization problems. Unlike existing approaches that use GNNs to solve problem instances directly, our method uses them to predict hyperparameters for a heuristic solver. The model is trained in a supervised fashion on a small dataset of graphs, with corresponding target hyperparameters obtained through conventional hyperparameter optimization routines. During inference, the model predicts near-optimal hyperparameters that minimize the runtime of the heuristic solver on unseen instances. Experiments show that our method generalizes well to much larger graphs and outperforms hand-tuned parameters. The framework is flexible and can be applied to a wide variety of combinatorial optimization problems and heuristic solvers.
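The inference pipeline described above can be sketched in plain NumPy: a small message-passing GNN maps a graph's adjacency matrix to a vector of solver hyperparameters. Everything here is illustrative, not the paper's actual architecture; the layer sizes, weight names, and the two-hyperparameter output (e.g. a step size and a noise amplitude for an Ising heuristic) are assumptions. A key property the sketch does preserve is permutation invariance: relabeling the graph's nodes leaves the predicted hyperparameters unchanged.

```python
import numpy as np

def gnn_predict_hparams(adj, W1, W2, w_out):
    """Map a graph (adjacency matrix) to solver hyperparameters.

    Two rounds of mean-aggregation message passing over uniform
    initial node features, followed by a mean readout and a sigmoid
    so each predicted hyperparameter lies in (0, 1).
    Weight shapes: W1 (d0, d1), W2 (d1, d2), w_out (d2, k).
    """
    n = adj.shape[0]
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    h = np.ones((n, W1.shape[0]))           # uniform initial node features
    h = np.tanh((adj @ h / deg) @ W1)       # message-passing layer 1
    h = np.tanh((adj @ h / deg) @ W2)       # message-passing layer 2
    g = h.mean(axis=0)                      # permutation-invariant readout
    return 1.0 / (1.0 + np.exp(-(g @ w_out)))

# Demo: random undirected graph and (untrained) random weights.
rng = np.random.default_rng(0)
n = 12
adj = (rng.random((n, n)) < 0.3).astype(float)
adj = np.triu(adj, 1)
adj = adj + adj.T                           # symmetric, no self-loops

W1 = 0.5 * rng.standard_normal((4, 8))
W2 = 0.5 * rng.standard_normal((8, 8))
w_out = 0.5 * rng.standard_normal((8, 2))   # k = 2 hyperparameters

hp = gnn_predict_hparams(adj, W1, W2, w_out)

# Relabeling nodes must not change the prediction.
perm = rng.permutation(n)
hp_perm = gnn_predict_hparams(adj[perm][:, perm], W1, W2, w_out)
```

In the paper's setting, `hp` would be fed to the heuristic solver and the weights would be fit by supervised regression against hyperparameters found offline by a conventional tuning routine; here the weights are random, so the output only demonstrates the graph-to-hyperparameter mapping.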
