Merge pull request #235 from EricFillion/ef/fix-gen-setting-table
Fixed Text Generation Setting Table
EricFillion committed Jun 12, 2021
2 parents 5c64293 + 7507e6f commit 89202b2
Showing 1 changed file with 13 additions and 11 deletions.
docs/pages/1-text-generation/3-settings.md (13 additions, 11 deletions)
```diff
@@ -24,17 +24,19 @@ GENSettings() contains the fields shown in Table 1.0
 
 #### Table 1.0:
 
-| Parameter            |Default| Definition                                                           |
-|----------------------|-------| ----------------------------------------------------------------------|
-| min_length           | 10    | Minimum number of generated tokens                                   |
-| max_length           | 50    | Maximum number of generated tokens                                   |
-| do_sample            | False | When True, picks words based on their conditional probability        |
-| early_stopping       | False | When True, generation finishes if the EOS token is reached           |
-| num_beams            | 1     | Number of steps for each search path                                 |
-| temperature          | 1.0   | How sensitive the algorithm is to selecting low probability options  |
-| top_k                | 50    | How many potential answers are considered when performing sampling   |
-| top_p                | 1.0   | The number minimum number of tokens are selected so that their cumulative probabilities add up to top_p |
-| no_repeat_ngram_size | 0     | The size of an n-gram that cannot occur more than once. (0=infinity) |
+| Parameter            |Default| Definition                                                                  |
+|----------------------|-------|------------------------------------------------------------------------------|
+| min_length           | 10    | Minimum number of generated tokens                                          |
+| max_length           | 50    | Maximum number of generated tokens                                          |
+| do_sample            | False | When True, picks words based on their conditional probability               |
+| early_stopping       | False | When True, generation finishes if the EOS token is reached                  |
+| num_beams            | 1     | Number of steps for each search path                                        |
+| temperature          | 1.0   | How sensitive the algorithm is to selecting low probability options         |
+| top_k                | 50    | How many potential answers are considered when performing sampling          |
+| top_p                | 1.0   | Min number of tokens are selected where their probabilities add up to top_p |
+| no_repeat_ngram_size | 0     | The size of an n-gram that cannot occur more than once. (0=infinity)        |
+
+
 
 #### Examples 1.2:
 
 ```python
```
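For context, here is a minimal sketch of how the settings in Table 1.0 are passed when generating text with Happy Transformer (assuming the happytransformer package's HappyGeneration and GENSettings interface that this documentation page describes; the model name, prompt, and parameter values below are illustrative and not part of the commit):

```python
from happytransformer import HappyGeneration, GENSettings

# Load a GPT-2 text-generation model (model choice is illustrative).
happy_gen = HappyGeneration("GPT2", "gpt2")

# Configure generation using the fields from Table 1.0.
args = GENSettings(
    min_length=10,
    max_length=50,
    do_sample=True,          # sample tokens by conditional probability instead of greedy decoding
    early_stopping=False,
    num_beams=1,
    temperature=0.7,         # values below 1.0 make low-probability tokens less likely
    top_k=50,                # restrict sampling to the 50 most likely tokens
    top_p=0.95,              # nucleus sampling: smallest token set whose probabilities sum to 0.95
    no_repeat_ngram_size=2,  # no 2-gram may appear more than once in the output
)

result = happy_gen.generate_text("Artificial intelligence is", args=args)
print(result.text)
```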
