
openai/gpt-2 #379


Open

wants to merge 2 commits into master

Conversation

Ritzman111

The original name for GPT-2, as released by OpenAI, was simply "GPT-2". It did not have a different name prior to its release.

The description provided by OpenAI for GPT-2 was as follows:

OpenAI developed several variants of the GPT-2 model with different numbers of parameters. The main versions of GPT-2 released by OpenAI are listed below, with a short loading sketch after the list:

  1. GPT-2 Small: 124 million parameters
  2. GPT-2 Medium: 355 million parameters
  3. GPT-2 Large: 774 million parameters
  4. GPT-2 Extra Large (XL): 1.5 billion parameters
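
As an illustrative sketch only (not part of the openai/gpt-2 code in this repository, which is TensorFlow-based), the snippet below loads each of the four variants through the Hugging Face `transformers` library and counts their parameters. The hub IDs `gpt2`, `gpt2-medium`, `gpt2-large`, and `gpt2-xl` are the standard Hugging Face names and are assumed here for convenience.

```python
# Sketch: load each released GPT-2 variant and report its parameter count.
# Assumes the Hugging Face `transformers` library and PyTorch are installed;
# the hub IDs below are Hugging Face names, not files from openai/gpt-2.
from transformers import GPT2LMHeadModel

GPT2_VARIANTS = {
    "gpt2":        "Small, ~124M parameters",
    "gpt2-medium": "Medium, ~355M parameters",
    "gpt2-large":  "Large, ~774M parameters",
    "gpt2-xl":     "XL, ~1.5B parameters",
}

for hub_id, description in GPT2_VARIANTS.items():
    model = GPT2LMHeadModel.from_pretrained(hub_id)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{hub_id} ({description}): {n_params:,} parameters")
```

Note that running this downloads every checkpoint, so it needs several gigabytes of disk space and memory for the XL model.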

GPT-2, which stands for "Generative Pre-trained Transformer 2", is a large-scale unsupervised language model that generates human-like text. It was trained on a dataset of 8 million web pages and is capable of generating coherent and contextually relevant sentences from a given prompt. GPT-2 is the successor to the original GPT model and features a more extensive architecture with 1.5 billion parameters, making it one of the largest and most powerful language models at the time of its release.
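
To make the "generate from a prompt" behaviour concrete, here is a minimal sketch, again assuming the Hugging Face `transformers` library and PyTorch rather than this repository's own TensorFlow scripts, that continues a prompt with the smallest (124M) checkpoint.

```python
# Minimal sketch: continue a text prompt with the 124M "gpt2" checkpoint.
# Assumes `transformers` and PyTorch; not the openai/gpt-2 sampling scripts.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "GPT-2 is a large-scale unsupervised language model that"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; top-k sampling keeps the output coherent.
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```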
