Contributed Talk in Workshop: Math AI for Education (MATHAI4ED): Bridging the Gap Between Research and Smart Education
Towards Grounded Natural Language Proof Generation
Jiacheng Liu
Abstract:
When a student is working on a mathematical proof, it is often helpful to receive suggestions about how to proceed. To this end, we provide an initial study of two generation tasks in natural mathematical language: suggesting the next step in a proof, and generating a full proof. As proofs are grounded in past results (e.g., theorems and definitions), we study knowledge-grounded generation methods and find that conditioning on retrieved or ground-truth knowledge greatly improves generations. We characterize error types and provide directions for future research.
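To illustrate the general idea of knowledge-grounded generation described in the abstract, the sketch below shows one way a proof context could be paired with retrieved reference statements before being handed to a generator. This is not the paper's implementation: the bag-of-words retriever, the function names (`retrieve_references`, `build_grounded_prompt`), and the toy knowledge bank are all illustrative assumptions.

```python
# Illustrative sketch of retrieval-conditioned prompt construction for
# next-step proof generation. All names and the toy data are hypothetical,
# not taken from the paper.
from collections import Counter
import math


def bag_of_words(text: str) -> Counter:
    """Lowercased unigram counts; a stand-in for a real retriever's encoder."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve_references(context: str, knowledge_bank: list[str], k: int = 2) -> list[str]:
    """Return the k reference statements most similar to the proof context."""
    query = bag_of_words(context)
    ranked = sorted(knowledge_bank, key=lambda r: cosine(query, bag_of_words(r)), reverse=True)
    return ranked[:k]


def build_grounded_prompt(context: str, knowledge_bank: list[str]) -> str:
    """Prepend retrieved theorems/definitions so the generator can condition on them."""
    refs = retrieve_references(context, knowledge_bank)
    ref_block = "\n".join(f"Reference: {r}" for r in refs)
    return f"{ref_block}\nProof so far: {context}\nNext step:"


# Toy usage: the resulting prompt would be fed to a sequence-to-sequence
# language model that generates the suggested next proof step.
bank = [
    "Definition: an integer n is even iff n = 2k for some integer k.",
    "Theorem: the sum of two even integers is even.",
    "Theorem: every prime greater than 2 is odd.",
]
print(build_grounded_prompt("Suppose m and n are even integers.", bank))
```

In practice a stronger retriever (e.g., dense embeddings) and a trained language model would replace the bag-of-words scoring and the printed prompt, but the conditioning step, i.e., concatenating retrieved knowledge with the proof context, has the same shape.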