Exploring Mathematical Conjecturing with Large Language Models
Paper in proceedings, 2023

The task of automating the discovery of mathematical conjectures has so far primarily been addressed in symbolic systems. However, a neuro-symbolic architecture seems like an excellent fit for this task. We can assign the generative task to a neural system without much risk: even if a few non-theorems slip through, the results are checked afterwards by a symbolic theorem prover or counter-example finder. In this initial case study, we investigate the capabilities of GPT-3.5 and GPT-4 on this task. While the results are mixed, we see potential for addressing the weaknesses of purely symbolic systems. A neuro-symbolic theory exploration system could, for instance, produce more varied conjectures than purely symbolic systems while not missing obvious candidates.
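
The abstract describes a generate-and-check division of labour: a neural model proposes candidate statements and a symbolic component filters them. Purely as an illustration of that idea (not the authors' implementation), the following Python sketch treats the LLM as an opaque propose_conjectures function and uses brute-force counter-example search over small integer lists as a stand-in for the symbolic checker; all names and the example conjectures are hypothetical.

# Hypothetical sketch of a neuro-symbolic conjecturing loop: a neural
# generator proposes candidate equations, a symbolic-style checker
# discards those with counter-examples. Illustration only.
from itertools import product

def propose_conjectures():
    """Stand-in for an LLM call (e.g. GPT-3.5/GPT-4) returning
    candidate equations over a small vocabulary of list functions."""
    return [
        ("rev(rev(xs)) == xs",
         lambda xs, ys: list(reversed(list(reversed(xs)))) == xs),
        # A plausible-looking non-theorem the checker should reject:
        ("rev(xs ++ ys) == rev(xs) ++ rev(ys)",
         lambda xs, ys: list(reversed(xs + ys))
                        == list(reversed(xs)) + list(reversed(ys))),
        ("len(xs ++ ys) == len(xs) + len(ys)",
         lambda xs, ys: len(xs + ys) == len(xs) + len(ys)),
    ]

def find_counterexample(prop, max_len=3):
    """Brute-force counter-example search over small integer lists,
    playing the role of the symbolic checker in the abstract."""
    small_lists = [list(t) for n in range(max_len + 1)
                   for t in product(range(3), repeat=n)]
    for xs in small_lists:
        for ys in small_lists:
            if not prop(xs, ys):
                return xs, ys
    return None

if __name__ == "__main__":
    for text, prop in propose_conjectures():
        cex = find_counterexample(prop)
        status = "kept" if cex is None else f"rejected, counter-example {cex}"
        print(f"{text}: {status}")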

Keywords

Conjecturing

Large Language Models

Theory Exploration

Authors

Moa Johansson

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Nicholas Smallbone

Chalmers, Computer Science and Engineering (Chalmers), Functional Programming

CEUR Workshop Proceedings

1613-0073 (ISSN)

Vol. 3432, pp. 62-77

17th International Workshop on Neural-Symbolic Learning and Reasoning (NeSy 2023), Siena, Italy

Subject Categories

Software Engineering

Computer Science
