Exploring Mathematical Conjecturing with Large Language Models
Paper in proceedings, 2023

The task of automatically discovering mathematical conjectures has so far primarily been addressed by symbolic systems. However, a neuro-symbolic architecture seems like an excellent fit for this task. We can assign the generative task to a neural system without much risk: even if a few non-theorems slip through, the results are checked afterwards by a symbolic theorem prover or counter-example finder. In this initial case study, we investigate the capabilities of GPT-3.5 and GPT-4 on this task. While the results are mixed, we see potential for improving on the weaknesses of purely symbolic systems. A neuro-symbolic theory exploration system could, for instance, produce more varied conjectures than a purely symbolic system while not missing obvious candidates.
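The abstract describes a generate-and-check pipeline: a neural model proposes candidate conjectures and a symbolic tool filters out non-theorems. The sketch below is a minimal, assumption-laden illustration of that loop, not the system from the paper: `propose_conjectures` is a hypothetical stand-in for a prompted GPT-3.5/GPT-4 call, and the checker is plain random testing rather than the theorem prover or counter-example finder the authors use.

```python
# Minimal sketch of the neuro-symbolic conjecturing loop (assumptions noted below).
import random
from typing import Callable, Optional

# A conjecture here is an executable equation over two integer lists.
Conjecture = Callable[[list[int], list[int]], bool]

def propose_conjectures() -> dict[str, Conjecture]:
    """Hypothetical neural step. In the paper this would be an LLM prompted
    with the background theory; here we hard-code two candidates about list
    reversal, one theorem and one non-theorem, to exercise the checker."""
    return {
        "reverse (xs ++ ys) == reverse ys ++ reverse xs":
            lambda xs, ys: list(reversed(xs + ys))
                           == list(reversed(ys)) + list(reversed(xs)),
        "reverse (xs ++ ys) == reverse xs ++ reverse ys":  # non-theorem
            lambda xs, ys: list(reversed(xs + ys))
                           == list(reversed(xs)) + list(reversed(ys)),
    }

def find_counterexample(conj: Conjecture,
                        tries: int = 500) -> Optional[tuple[list[int], list[int]]]:
    """Symbolic check, approximated by random testing: a real system would
    call a theorem prover or a QuickCheck-style counter-example finder."""
    for _ in range(tries):
        xs = [random.randint(0, 5) for _ in range(random.randint(0, 4))]
        ys = [random.randint(0, 5) for _ in range(random.randint(0, 4))]
        if not conj(xs, ys):
            return xs, ys
    return None

if __name__ == "__main__":
    # Keep conjectures with no counter-example; reject the rest.
    for statement, conj in propose_conjectures().items():
        cex = find_counterexample(conj)
        status = "kept" if cex is None else f"rejected, counter-example {cex}"
        print(f"{statement}: {status}")
```

The point of the division of labour is visible even in this toy version: the generator may freely over-propose, because any false candidate (such as the second equation above) is cheaply discarded by the checking step.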

Conjecturing

Large Language Models

Theory Exploration

Authors

Moa Johansson

Chalmers, Computer Science and Engineering, Data Science and AI

Nicholas Smallbone

Chalmers, Computer Science and Engineering, Functional Programming

CEUR Workshop Proceedings

1613-0073 (ISSN)

Vol. 3432, pp. 62-77

17th International Workshop on Neural-Symbolic Learning and Reasoning, NeSy 2023,
Siena, Italy

Subject categories

Software Engineering

Computer Science

More information

Last updated

2023-08-24