Poster in Workshop: NeurIPS 2023 Workshop on Diffusion Models
Drag-guided diffusion models for vehicle image generation
Nikos Arechiga · Frank Permenter · Binyang Song · Chenyang Yuan
Denoising diffusion models trained at web-scale have revolutionized image generation. The application of these tools to engineering design holds promise but is currently limited by their inability to understand and adhere to concrete engineering constraints. In this paper, we take a step toward incorporating quantitative constraints into diffusion models by proposing physics-based guidance, which enables the optimization of a performance metric (as predicted by a surrogate model) during the generation process. As a proof of concept, we add drag guidance to Stable Diffusion, enabling it to generate images of novel vehicles while simultaneously minimizing their predicted drag coefficients.
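The guidance idea described in the abstract can be illustrated with a minimal sketch: at each denoising step, estimate the clean sample, query a differentiable surrogate drag predictor, and nudge the noise prediction along the gradient that lowers predicted drag. This is a generic classifier-guidance-style sketch, not the authors' released code; `denoiser`, `drag_surrogate`, `alphas_cumprod`, and `guidance_scale` are assumed placeholder names for a pretrained diffusion model, a differentiable drag-coefficient surrogate, its noise schedule, and a tunable strength.

```python
import torch

def drag_guided_sample(denoiser, drag_surrogate, alphas_cumprod, shape,
                       guidance_scale=1.0, num_steps=50, device="cpu"):
    """DDIM-style sampling with an extra gradient term that steers the
    predicted clean sample toward lower surrogate-predicted drag."""
    x = torch.randn(shape, device=device)
    timesteps = torch.linspace(len(alphas_cumprod) - 1, 0, num_steps,
                               device=device).long()

    for i, t in enumerate(timesteps):
        alpha_bar = alphas_cumprod[t]
        with torch.enable_grad():
            x_in = x.detach().requires_grad_(True)
            eps = denoiser(x_in, t)                      # predicted noise
            # Tweedie estimate of the clean sample from the noisy latent
            x0_hat = (x_in - torch.sqrt(1 - alpha_bar) * eps) / torch.sqrt(alpha_bar)
            drag = drag_surrogate(x0_hat).sum()          # predicted drag coefficient
            grad = torch.autograd.grad(drag, x_in)[0]    # d(drag)/d(x_t)

        # Shift the noise prediction so the sample drifts toward lower drag
        eps = eps.detach() + guidance_scale * torch.sqrt(1 - alpha_bar) * grad

        # Deterministic DDIM update to the previous timestep
        t_prev = timesteps[i + 1] if i + 1 < len(timesteps) else torch.tensor(0, device=device)
        alpha_bar_prev = alphas_cumprod[t_prev]
        x0_hat = (x - torch.sqrt(1 - alpha_bar) * eps) / torch.sqrt(alpha_bar)
        x = torch.sqrt(alpha_bar_prev) * x0_hat + torch.sqrt(1 - alpha_bar_prev) * eps
    return x
```

In this sketch the surrogate plays the role of a classifier in classifier guidance: its gradient with respect to the noisy sample is scaled by sqrt(1 - alpha_bar) and added to the noise prediction, so sampling trades off image fidelity against the predicted drag coefficient via `guidance_scale`.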