ClaimGPT250203

Create a Generative Pretrained Transformer model for Metamath

  • This project uses Python 3.11.

  • Training is best done on a GPU (see the environment-check sketch after this list).

  • Run the notebook claim_gpt.ipynb to get started; refer to its README.
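Before launching the notebook, it can help to confirm the environment. The snippet below is a minimal sketch, assuming the project runs on PyTorch as Karpathy's gpt.py does; it is not code taken from this repository.

```python
# Quick environment check before running claim_gpt.ipynb.
# Sketch only -- assumes PyTorch is installed, as in Karpathy's gpt.py.
import sys
import torch

# The project is developed with Python 3.11.
print(f"Python version: {sys.version.split()[0]}")

# Training is much faster on a GPU; fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Training device: {device}")
```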

This project is based on the GitHub repo ng-video-lecture by Andrej Karpathy: https://github.com/karpathy/ng-video-lecture

My code is based on gpt.py from that repo (about 225 lines), which I refactored into multiple Python files; I ignored bigram.py. That repo trains on a Shakespeare corpus to generate Shakespeare-like dialogue. I kept the model code from gpt.py and replaced the Shakespeare corpus with a Metamath corpus that I generated with Python.
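For illustration, the swap amounts to pointing the character-level data pipeline from gpt.py at a Metamath-derived text file instead of the Shakespeare text. The sketch below follows Karpathy's gpt.py; the file name metamath_corpus.txt is hypothetical and not taken from this repository.

```python
# Sketch of swapping the training corpus, following the character-level
# pipeline in Karpathy's gpt.py. The file name 'metamath_corpus.txt' is
# hypothetical; it stands in for whatever the Metamath extraction produces.
with open('metamath_corpus.txt', 'r', encoding='utf-8') as f:
    text = f.read()

# Build a character-level vocabulary from the corpus, exactly as gpt.py
# does for the Shakespeare text.
chars = sorted(set(text))
vocab_size = len(chars)
stoi = {ch: i for i, ch in enumerate(chars)}
itos = {i: ch for i, ch in enumerate(chars)}
encode = lambda s: [stoi[c] for c in s]             # string -> list of ints
decode = lambda ids: ''.join(itos[i] for i in ids)  # list of ints -> string
```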
