update docs of length_penalty (huggingface#17022)
manandey authored and elusenji committed Jun 12, 2022
1 parent 74d9e0c commit 0ef1c5c
Showing 1 changed file with 3 additions and 3 deletions: src/transformers/generation_utils.py
```diff
@@ -950,9 +950,9 @@ def generate(
             eos_token_id (`int`, *optional*):
                 The id of the *end-of-sequence* token.
             length_penalty (`float`, *optional*, defaults to 1.0):
-                Exponential penalty to the length. 1.0 means no penalty. Set to values < 1.0 in order to encourage the
-                model to generate shorter sequences, to a value > 1.0 in order to encourage the model to produce longer
-                sequences.
+                Exponential penalty to the length. 1.0 means that the beam score is penalized by the sequence length. 0.0 means no penalty. Set to values < 0.0 in order to encourage the
+                model to generate longer sequences, to a value > 0.0 in order to encourage the model to produce shorter
+                sequences.
             no_repeat_ngram_size (`int`, *optional*, defaults to 0):
                 If set to int > 0, all ngrams of that size can only occur once.
             encoder_no_repeat_ngram_size (`int`, *optional*, defaults to 0):
```
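For context on the mechanics this docstring describes (not part of the commit itself): in beam-based generation, `transformers` applies `length_penalty` when a finished beam hypothesis is scored, dividing the hypothesis's summed token log-probabilities by its sequence length raised to `length_penalty`. Below is a minimal sketch of that normalization; the helper name `length_normalized_score` and the example values are illustrative, not part of the library's API.

```python
def length_normalized_score(sum_logprobs: float, seq_len: int, length_penalty: float) -> float:
    """Illustrative helper (not a transformers API): length-normalize a beam score.

    Sketches the normalization applied when a finished beam hypothesis is
    scored: the summed token log-probabilities are divided by the sequence
    length raised to ``length_penalty``.
    """
    return sum_logprobs / (seq_len ** length_penalty)


# Example values are made up. With length_penalty == 0.0 the divisor is
# seq_len ** 0 == 1, so the raw sum of log-probabilities is unchanged.
print(length_normalized_score(-6.0, seq_len=10, length_penalty=0.0))  # -6.0
# With length_penalty == 1.0 the score is divided by the full sequence length.
print(length_normalized_score(-6.0, seq_len=10, length_penalty=1.0))  # -0.6
```

How these rescaled scores compare across beams of different lengths is exactly the behavior the reworded docstring above tries to pin down.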
