Describe the bug
I can fine-tune gpt-3.5-turbo just fine, but it looks like there's no way to change the number of epochs to anything other than 10, even though the Guides indicate that you can ("If the model becomes less diverse than expected decrease the number by 1 or 2 epochs"). It seems to be overtraining quite a bit; past experience is that most LLMs only need 1-2 epochs.
There doesn't seem to be an actual argument that can be passed to `FineTune.create` to change the number of training epochs. I've fine-tuned `davinci-003` before with the parameter `n_epochs`, but it doesn't seem to work with ChatGPT fine-tuning. There's no reference to epochs in `help(FineTune.create)`, and passing `epochs` or `n_epochs` to `FineTune.create` yields `extra fields not permitted`.
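For comparison, a minimal sketch of the legacy call that did accept an epoch count (the training file ID and epoch value are placeholders):

```python
import openai

# Legacy fine-tunes endpoint: n_epochs is accepted as a top-level argument.
openai.FineTune.create(
    training_file="file-abc123",  # placeholder for an uploaded training file
    model="davinci",
    n_epochs=2,  # worked with the legacy endpoint
)
```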
Running it through `openai api fine_tunes.create` also doesn't allow specifying epochs. Am I missing something?
To Reproduce
- Format and upload a dataset
- Create a training job using the Python SDK with `FineTuningJob.create`, passing an argument to change the number of epochs (see the sketch below)
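A minimal sketch of the call being attempted; whether the new endpoint accepts an epoch count at all, and under what name, is exactly what's unclear (the file ID and values are placeholders):

```python
import openai

# New fine-tuning endpoint for gpt-3.5-turbo. Passing n_epochs as a
# top-level argument, as the legacy endpoint allowed, is rejected.
openai.FineTuningJob.create(
    training_file="file-abc123",  # placeholder file ID
    model="gpt-3.5-turbo",
    n_epochs=2,  # rejected: "extra fields not permitted"
)
```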
Code snippets
No response
OS
Ubuntu 20.04
Python version
Python 3.7.5
Library version
openai-python 0.27.9