Describe the bug
The documentation for frequency_penalty says:

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.

However, I can pass a value like 2.3 to this parameter and the API doesn't raise anything, so either the API or the documentation is wrong. Note that this doesn't happen with the presence_penalty parameter, which, if set to e.g. 2.2, makes the API return an error: openai.error.InvalidRequestError: 2.2 is greater than the maximum of 2 - 'presence_penalty'. Even if I set frequency_penalty=10000, the API doesn't raise an error, and the same is true for negative values like -1000. So either this parameter is meant to accept any floating-point number and the documentation is wrong, or the API's validation is buggy.
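For comparison, here is a minimal sketch of the contrast described above; it assumes a valid API key is already configured and uses the same text-davinci-003 model as the reproduction below:

import openai

# Assumes openai.api_key has been set.

# presence_penalty out of range: the API rejects this with
# openai.error.InvalidRequestError ("2.2 is greater than the maximum of 2").
try:
    openai.Completion.create(
        model="text-davinci-003",
        prompt="hello",
        presence_penalty=2.2,
    )
except openai.error.InvalidRequestError as e:
    print("presence_penalty rejected:", e)

# frequency_penalty equally out of range: no error is raised, the call succeeds.
completion = openai.Completion.create(
    model="text-davinci-003",
    prompt="hello",
    frequency_penalty=2.3,
)
print("frequency_penalty accepted:", completion["choices"][0]["text"])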
To Reproduce
Make any call to the completions endpoint with valid values for the other parameters and an out-of-range frequency_penalty; the error you're supposed to get is never raised.
Code snippets
import openai

# Set your API key, e.g. openai.api_key = "..."
completions = openai.Completion.create(
    model="text-davinci-003",
    prompt="hello",
    frequency_penalty=1000,  # far outside the documented [-2.0, 2.0] range, yet no error
)
print(completions)
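For reference, this is a rough sketch of the check one would expect if the documented bounds were enforced; the [-2.0, 2.0] range comes from the documentation and the helper name is hypothetical, not part of the library:

# Hypothetical range check mirroring what the API already does for presence_penalty.
def check_frequency_penalty(value: float) -> None:
    if not -2.0 <= value <= 2.0:
        raise ValueError(
            f"{value} is outside the documented range [-2.0, 2.0] for 'frequency_penalty'"
        )

check_frequency_penalty(1000)  # would raise ValueError under the documented bounds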
OS
macOS Monterey (12.5.1)
Python version
Python 3.8.13
Library version
0.27.0