

Poster in Workshop: MATH-AI: The 4th Workshop on Mathematical Reasoning and AI

Math2Sym: A System for Solving Elementary Problems via Large Language Models and Symbolic Solvers

Nguyen Phu · Phuong Pham · Man Ngo · Tuan Minh Kha

Keywords: [ Formalization ] [ Math Word Problems ] [ Large language models ] [ Symbolic ]


Abstract:

Traditional models for solving math word problems (MWPs) often struggle to capture both linguistic context and arithmetic reasoning. We propose Math2Sym, a novel approach that integrates large language models (LLMs) with symbolic solvers. The method combines the language comprehension of LLMs with the precision of symbolic computation to efficiently convert MWPs into a solvable symbolic form. We introduce the EMSF dataset for training models to formalize math problems of varying complexity. On our benchmark test set, fine-tuned models outperform GPT-3.5 by 17% in few-shot tasks and perform comparably to GPT-4-mini on elementary math problems.
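To illustrate the pipeline the abstract describes, here is a minimal sketch of the second half of such a system: an LLM formalizer emits symbolic equations for a word problem, and a symbolic solver finds the answer. The specific word problem, the equation encoding, and the choice of SymPy as the solver are illustrative assumptions, not details from the paper.

```python
from sympy import symbols, Eq, solve

# Word problem (invented example): "Ann has 3 more apples than Bob.
# Together they have 11 apples. How many apples does Bob have?"
#
# In a Math2Sym-style system, a fine-tuned LLM would translate the
# problem text into a symbolic form; below we hand-write that output
# and hand it to SymPy (our choice of solver, not necessarily the
# authors') for exact computation.
a, b = symbols("a b")          # a = Ann's apples, b = Bob's apples
equations = [
    Eq(a, b + 3),              # "Ann has 3 more apples than Bob"
    Eq(a + b, 11),             # "Together they have 11"
]

solution = solve(equations, (a, b))
print(solution)  # {a: 7, b: 4}
```

Separating formalization (the LLM's job) from solving (the symbolic engine's job) is what gives this design its arithmetic precision: once the equations are correct, the answer is exact rather than generated token by token.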
