

Poster

Neural Model Checking

Mirco Giacobbe · Daniel Kroening · Abhinandan Pal · Michael Tautschnig

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

We introduce a machine learning approach to model checking hardware designs. Model checking answers the question of whether every execution of a given system satisfies a desired temporal logic specification. Unlike testing, model checking provides formal guarantees. Its application is standard practice in silicon design, and the EDA industry has invested decades in developing performant symbolic model checking algorithms. Our new approach combines machine learning and symbolic reasoning by using neural networks as formal proof certificates for linear temporal logic. We train our neural certificates from randomly generated executions of the system, and we then symbolically check their validity using satisfiability solving which, upon an affirmative answer, establishes that the system provably satisfies the specification. We leverage the expressive power of neural networks to represent proof certificates, as well as the fact that checking a certificate is much simpler than finding one. As a result, our machine learning procedure for model checking is entirely unsupervised, formally sound, and practically effective. We implemented a prototype and compared the performance of our method with state-of-the-art academic and commercial model checkers on a set of standard hardware designs written in SystemVerilog. In most of our experiments, our method demonstrates superior scalability compared to the academic tools, while performing comparably to the leading commercial tool in nearly half of the cases.
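The train-then-check loop described in the abstract can be illustrated on a toy example. The sketch below is an assumption for illustration only, not the paper's implementation: the "system" is a simple down-counter, the "neural certificate" is a one-parameter linear ranking function trained from randomly sampled transitions, and the symbolic check (done with a SAT solver in the paper) is replaced by exhaustive enumeration over the finite state space. It shows the key asymmetry the abstract exploits: finding the certificate is a learning problem, while checking it is a simple validity test.

```python
import random

random.seed(0)  # deterministic for reproducibility

# Toy system (hypothetical, not from the paper): a 3-bit down-counter.
# Specification: from every state, the counter eventually reaches 0.
STATES = range(8)

def step(x):
    return x - 1 if x > 0 else 0

# "Neural" certificate reduced to one parameter: a ranking function
# V(x) = w * x. The liveness spec holds if V strictly decreases
# (by at least 1) along every transition out of a non-zero state.
w = random.uniform(-1.0, 1.0)
lr = 0.1
for _ in range(500):
    # Unsupervised training: sample a random transition and push the
    # parameter toward satisfying the decrease condition with margin 1.
    x = random.choice([s for s in STATES if s > 0])
    y = step(x)
    if w * x - w * y < 1.0:
        w += lr * (x - y)  # gradient step on the violated margin

# Certificate checking: verify the decrease condition for ALL states.
# The paper discharges this with satisfiability solving; on a finite
# toy system we can simply enumerate.
valid = all(w * step(x) <= w * x - 1.0 for x in STATES if x > 0)
print(valid)  # True: the learned certificate proves the spec
```

An affirmative check establishes the property for every execution, not just the sampled ones; a failed check would send the procedure back to training, which is what keeps the overall method sound despite the statistical training phase.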
