Add documentation for generation_config #304
Conversation
@@ -487,6 +488,9 @@ Using a trained model, make predictions from the provided dataset.
  to use. Possible values are `'full'`, `'training'`, `'validation'`, `'test'`.
- **batch_size** (int, default: 128): size of batch to use when making
  predictions.
- **generation_config** (Dict, default: `None`): config for the generation of the
I'd also just make a mention that this is only used if the model type is LLM, otherwise it is ignored.
Then, I'd also add a link to the docs for generation config (it's in configuration > large language models) indicating that users can see how to configure this parameter in those docs so that users know what kind of parameters they can use.
Wdyt?
Also, there's an equivalent page for CLI based commands with arguments. Can we add this there as well?
On it
Hmm. This page should be generated automatically using code_doc_autogen.py, which runs every night. I wonder why this page wasn't automatically updated.
Ah, here's the PR: #303. In general, if there's documentation we want to add to this page, we should update api.py directly.
This is the update to Ludwig: ludwig-ai/ludwig#3535
docs/user_guide/api/LudwigModel.md
Outdated
- **generation_config** (Dict, default: `None`): config for the generation of the
  predictions. If `None`, the config that was used during model training is
  used. This is only used if the model type is LLM. Otherwise, this parameter is
  ignored. See [Large Language Models](../../configuration/large_language_model.md) under "Generation" for the supported parameters.
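A minimal sketch of how this parameter might be used, assuming a trained LLM-type Ludwig model on disk; the model path and the specific generation keys shown (`max_new_tokens`, `temperature`, `do_sample`) are illustrative, drawn from the "Generation" section of the Large Language Models configuration docs:

```python
# Illustrative sketch: override generation settings at prediction time
# instead of reusing the generation config from training.
generation_config = {
    "max_new_tokens": 64,   # illustrative value
    "temperature": 0.7,     # illustrative value
    "do_sample": True,
}

# The call below requires a trained model on disk, so it is shown commented out:
# from ludwig.api import LudwigModel
# model = LudwigModel.load("results/experiment_run/model")  # hypothetical path
# predictions, _ = model.predict(
#     dataset="test.csv",
#     generation_config=generation_config,  # ignored unless model type is LLM
# )
```

For non-LLM model types the dictionary is simply ignored, so it is safe to pass unconditionally.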
Heads up: this style of linking (unfortunately) doesn't work when the website is built and deployed by mike. The best solution I've found is to just use the relative link style (`/latest/configuration/large_language_model/`).
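As a sketch, the two link styles side by side (the target page is the one from the diff above):

```markdown
<!-- Doesn't resolve once mike deploys the versioned site: -->
See [Large Language Models](../../configuration/large_language_model.md) under "Generation".

<!-- Works with mike's versioned deploy: -->
See [Large Language Models](/latest/configuration/large_language_model/) under "Generation".
```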
If you want to replicate and verify, follow the instructions in the Versioned Docs section of the README.
Thanks Martin. I've updated the link.
Add documentation in the docs for #3520