Poster in AI4Mat-2024: NeurIPS 2024 Workshop on AI for Accelerated Materials Design
Scaling autoregressive models for lattice thermodynamics
Xiaochen Du · Sulin Liu · Rafael Gomez-Bombarelli
Keywords: [ lattices ] [ in-painting ] [ alloys ] [ thermodynamics ] [ Ising model ] [ solid state ] [ marginalization models ] [ Transformer ] [ autoregressive models ] [ generative models ]
Understanding the thermodynamics of solid-state, crystalline materials is important for applications ranging from catalysis to electronics. Traditional sampling methods for the discrete configuration spaces of periodic lattices are generally based on Markov chain Monte Carlo, which is limited in speed and scalability. Autoregressive methods, used in text and image generation, have been adapted for sampling lattice thermodynamics. However, these methods rely on a fixed generation order and cannot efficiently evaluate arbitrary marginal likelihoods, making it challenging to scale to larger lattices. Here, we develop and combine marginalization models with any-order inference and show that the resulting models enable generation of larger lattices in two ways: scaling training to larger lattice sizes, and in-/out-painting with models trained on smaller lattices. We demonstrate our method on the Ising model and the CuAu alloy.
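The any-order generation and in-painting described above can be illustrated with a minimal sketch. This is not the authors' model: in place of a learned Transformer conditional, it uses a hand-rolled local Boltzmann conditional for a 2D Ising lattice, and the function name `sample_any_order`, the lattice size, and the inverse temperature `beta` are illustrative assumptions. The key idea it shows is that unassigned sites can be visited in an arbitrary order, so observed sites (e.g. a fixed border) can simply be left in place and the rest filled in around them.

```python
import numpy as np

def sample_any_order(lattice, beta=1.0, rng=None):
    """Fill the unassigned sites (marked 0) of a spin lattice in a random
    order, sampling each spin from a toy local conditional.

    Hypothetical stand-in for a learned any-order autoregressive model:
    each spin is drawn from the Boltzmann distribution of its local field
    given already-assigned neighbours (periodic boundary conditions).
    """
    rng = np.random.default_rng(rng)
    lat = lattice.copy()
    n = lat.shape[0]
    # Any-order inference: visit the unknown sites in a random permutation.
    unknown = np.argwhere(lat == 0)
    for i, j in rng.permutation(unknown):
        # Local field from assigned neighbours (unassigned 0s contribute nothing).
        h = (lat[(i + 1) % n, j] + lat[(i - 1) % n, j]
             + lat[i, (j + 1) % n] + lat[i, (j - 1) % n])
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))  # P(spin = +1 | field)
        lat[i, j] = 1 if rng.random() < p_up else -1
    return lat

# In-painting: fix the border spins of an 8x8 lattice, fill the interior.
n = 8
lat = np.zeros((n, n), dtype=int)
lat[0, :] = lat[-1, :] = lat[:, 0] = lat[:, -1] = 1  # observed sites
filled = sample_any_order(lat, beta=0.5, rng=0)
```

Out-painting works the same way in this sketch: fix an interior patch generated on a small lattice and sample the surrounding sites of a larger one.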