This project is an implementation of the GPT language model, using the "Attention is All You Need" paper as a guide and drawing inspiration from OpenAI's GPT-2 and GPT-3 models. The works of Shakespeare serve as the dataset for training and fine-tuning the model.
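The core mechanism from "Attention is All You Need" that a GPT-style model builds on is scaled dot-product attention with a causal mask, so each position can only attend to earlier positions. The repository's own code is not shown here, but the idea can be sketched in plain Python (a minimal illustration with hypothetical function names, not this project's actual implementation, which uses PyTorch):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask.

    q, k, v: lists of T vectors (lists of floats), each of dimension d.
    Position t attends only to positions <= t, as in a GPT decoder.
    """
    d = len(q[0])
    out = []
    for t, qt in enumerate(q):
        # Scores of the query at position t against keys at positions 0..t,
        # scaled by sqrt(d) as in the paper.
        scores = [sum(a * b for a, b in zip(qt, k[s])) / math.sqrt(d)
                  for s in range(t + 1)]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[s][i] for s, w in enumerate(weights))
                    for i in range(d)])
    return out
```

Because of the causal mask, the first output position depends only on the first value vector; in the real model this sketch corresponds to one attention head, which PyTorch implements far more efficiently as batched tensor operations.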
forked from VanekPetr/my-own-GPT
A simple PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) language model from scratch.
ItsJohhny/my-own-GPT