Create a Generative Pretrained Transformer model for Metamath
-
I am using Python 3.11.
-
It is best to use a GPU to train.
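Because the model code follows gpt.py, training is PyTorch-based. As a minimal sketch (not the project's exact code), selecting a GPU when one is available looks like this:

```python
import torch

# Prefer a CUDA GPU when one is available; otherwise fall back to the CPU.
device = 'cuda' if torch.cuda.is_available() else 'cpu'
print(f"training on: {device}")

# The model and each batch must then be moved to that device, e.g.
#   model = model.to(device)
#   x, y = x.to(device), y.to(device)
```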
-
Run the notebook claim_gpt.ipynb to get started; refer to its README for details.
This project is based on Andrej Karpathy's GitHub repo https://github.com/karpathy/ng-video-lecture, which trains on a Shakespeare corpus to generate Shakespeare-like dialogue.
My code is based on that repo's gpt.py (225 lines), which I refactored into multiple .py files; I ignored bigram.py. I kept the model code from gpt.py and simply replaced the Shakespeare corpus with a Metamath corpus that I generated from Python.
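For illustration, the corpus swap amounts to pointing gpt.py's character-level data pipeline at the Metamath text instead of the Shakespeare text. The sketch below assumes that pipeline; the file name metamath_corpus.txt is only a placeholder for whatever the corpus-generation script actually writes.

```python
import torch

# 'metamath_corpus.txt' is a placeholder name; substitute the file that the
# corpus-generation script actually produces.
with open('metamath_corpus.txt', 'r', encoding='utf-8') as f:
    text = f.read()

# Character-level vocabulary, following the approach in Karpathy's gpt.py.
chars = sorted(set(text))
vocab_size = len(chars)
stoi = {ch: i for i, ch in enumerate(chars)}
itos = {i: ch for ch, i in stoi.items()}
encode = lambda s: [stoi[c] for c in s]             # string -> list of ints
decode = lambda ids: ''.join(itos[i] for i in ids)  # list of ints -> string

# Encode the whole corpus and keep the last 10% as a validation split.
data = torch.tensor(encode(text), dtype=torch.long)
n = int(0.9 * len(data))
train_data, val_data = data[:n], data[n:]
```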