Head to the 'Wiki' tab to see more info on the project!
1. G: Set up GPT-2 small for finetuning
2. D: Function to grab a random doc from the wiki pages (Function S)
3. D: Function to prep the file for GPT-2 submission; read this article on byte-pair encoding (BPE) tokenization
4. Train on a 1/x portion of the doc
5. G: Function to eval perplexity
6. G: Eval continue vs. stop for a given text (Function F)
7. D: Function to jump back to either 4 & 5 or 2 (Function S)
8. D: Function to log visited files and perplexity levels

Rough sketches of each step are below.
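A minimal sketch for step 1, assuming the Hugging Face transformers library is the training stack (the notes don't pin one down); `"gpt2"` is the small 124M checkpoint.

```python
# Sketch for step 1: load GPT-2 small and an optimizer for finetuning.
# Assumes Hugging Face transformers; "gpt2" is the 124M-parameter small model.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

def setup_gpt2(device: str = "cuda" if torch.cuda.is_available() else "cpu"):
    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
    model.train()  # enable dropout etc. while finetuning
    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
    return model, tokenizer, optimizer
```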
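For steps 2-3, a sketch that assumes the wiki docs have already been dumped as plain-text files in a local folder (`wiki_docs/` is a made-up path). GPT-2's tokenizer already applies byte-pair encoding, so prepping a file mostly means encoding it to token IDs.

```python
# Sketch for steps 2-3: pick a random wiki doc and BPE-encode it for GPT-2.
# Assumes the docs are plain-text files in a local directory (hypothetical path).
import random
from pathlib import Path

import torch

def sample_doc(doc_dir: str = "wiki_docs", visited: set | None = None) -> Path:
    """Function S: return a random doc path, skipping already-visited files."""
    visited = visited or set()
    candidates = [p for p in Path(doc_dir).glob("*.txt") if p not in visited]
    return random.choice(candidates)

def prep_doc(path: Path, tokenizer) -> torch.Tensor:
    """Byte-pair-encode the file into a 1 x n_tokens tensor of token IDs."""
    text = path.read_text(encoding="utf-8")
    return tokenizer(text, return_tensors="pt").input_ids
```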
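Steps 4-5 as a sketch: take a gradient step on the first 1/x of the doc, then compute perplexity as exp of the mean token-level cross-entropy (the loss GPT-2 returns when labels are passed). The split ratio x and the single-step training loop are placeholders.

```python
# Sketch for steps 4-5: finetune on the first 1/x of a doc, then eval perplexity.
import math
import torch

def train_on_portion(model, optimizer, input_ids, x: int = 4, device="cpu"):
    """One gradient step on the first 1/x of the doc (clipped to GPT-2's 1024-token context)."""
    cut = min(input_ids.shape[1] // x, 1024)
    chunk = input_ids[:, :cut].to(device)
    outputs = model(chunk, labels=chunk)  # loss = mean next-token cross-entropy
    optimizer.zero_grad()
    outputs.loss.backward()
    optimizer.step()
    return outputs.loss.item()

@torch.no_grad()
def eval_perplexity(model, input_ids, device="cpu") -> float:
    """Perplexity = exp(mean cross-entropy) over the doc (clipped to 1024 tokens)."""
    ids = input_ids[:, :1024].to(device)
    model.eval()
    loss = model(ids, labels=ids).loss
    model.train()
    return math.exp(loss.item())
```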
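The notes don't say how Function F (step 6) should decide continue vs. stop; a simple perplexity threshold is one option, and both the rule and the 40.0 default below are assumptions.

```python
# Sketch for step 6 (Function F): decide whether to keep training on this doc.
# The threshold rule and its value are assumptions, not fixed in the notes.
def should_continue(perplexity: float, threshold: float = 40.0) -> bool:
    """Continue while the model is still 'surprised' by the text."""
    return perplexity > threshold
```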
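Steps 7-8 tied together: a driver loop that jumps back to steps 4-5 (keep training on the same doc) or step 2 (grab a new doc) based on Function F, and logs each visited file with its perplexity. It reuses the helpers sketched above; the retrain cap and CSV log format are assumptions.

```python
# Sketch for steps 7-8: control loop that jumps back to steps 4-5 or step 2,
# logging visited files and perplexity. Uses the helpers sketched above.
import csv

def run(n_docs: int = 10, x: int = 4, max_steps: int = 20, device: str = "cpu"):
    model, tokenizer, optimizer = setup_gpt2(device)
    visited, log = set(), []
    for _ in range(n_docs):                      # step 2: grab a new doc
        path = sample_doc(visited=visited)
        visited.add(path)
        input_ids = prep_doc(path, tokenizer)    # step 3
        for _ in range(max_steps):               # steps 4-5, with a retrain cap
            train_on_portion(model, optimizer, input_ids, x, device)
            ppl = eval_perplexity(model, input_ids, device)
            log.append({"file": path.name, "perplexity": ppl})
            if not should_continue(ppl):         # step 6 (Function F)
                break                            # step 7: jump back to step 2
    with open("visit_log.csv", "w", newline="") as f:   # step 8
        writer = csv.DictWriter(f, fieldnames=["file", "perplexity"])
        writer.writeheader()
        writer.writerows(log)
```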