
Add documentation for generation_config #304

Merged · 3 commits · Sep 18, 2023

Conversation

Infernaught (Collaborator):
Add documentation to the docs for #3520

@@ -487,6 +488,9 @@ Using a trained model, make predictions from the provided dataset.
to use. Possible values are `'full'`, `'training'`, `'validation'`, `'test'`.
- **batch_size** (int, default: 128): size of batch to use when making
predictions.
- **generation_config** (Dict, default: `None`) config for the generation of the
Contributor:
I'd also just make a mention that this is only used if the model type is LLM, otherwise it is ignored.

Then, I'd also add a link to the docs for generation config (it's in configuration > large language models) indicating that users can see how to configure this parameter in those docs so that users know what kind of parameters they can use.

Wdyt?

Contributor:

Also, there's an equivalent page for CLI based commands with arguments. Can we add this there as well?

Infernaught (Collaborator, author):

On it

Collaborator:

Hmm. This page should be generated automatically using code_doc_autogen.py, which runs every night. I wonder why this page wasn't automatically updated.

Collaborator:

Ah, here's the PR: #303. In general, if there's documentation we want to add to this page, we should update api.py directly.

Infernaught (Collaborator, author):

This is the update to Ludwig: ludwig-ai/ludwig#3535

- **generation_config** (Dict, default: `None`) config for the generation of the
predictions. If `None`, the config that was used during model training is
used. This is only used if the model type is LLM. Otherwise, this parameter is
ignored. See [Large Language Models](../../configuration/large_language_model.md) under "Generation" for
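The documented fallback ("If `None`, the config that was used during model training is used") can be sketched in plain Python. This is an illustrative sketch only: the helper name `resolve_generation_config` and the option names `temperature` and `max_new_tokens` are assumptions (common generation options), not taken from this PR.

```python
# Hypothetical sketch of the documented fallback behavior.
# Option names (temperature, max_new_tokens) are assumed examples.

def resolve_generation_config(passed_config, training_config):
    """Fall back to the training-time generation config when no
    config is passed at predict time; otherwise use the override."""
    return training_config if passed_config is None else passed_config

training_time = {"temperature": 0.7, "max_new_tokens": 64}
override = {"temperature": 0.1, "max_new_tokens": 256}

print(resolve_generation_config(None, training_time))      # falls back to training config
print(resolve_generation_config(override, training_time))  # uses the override
```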
Collaborator:
Heads up: this style of linking (unfortunately) doesn't work when the website is built and deployed by mike. The best solution I've found is to just use the relative link style (/latest/configuration/large_language_model/).
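As an illustration of the suggested change (the surrounding text is from the diff above; the replacement path follows the reviewer's suggested relative-link style):

```diff
-See [Large Language Models](../../configuration/large_language_model.md) under "Generation" for
+See [Large Language Models](/latest/configuration/large_language_model/) under "Generation" for
```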

Collaborator:

If you want to replicate and verify, follow the instructions in the Versioned Docs section of the README.

Infernaught (Collaborator, author):
Thanks, Martin. I've updated the link.

@Infernaught Infernaught merged commit 8be30ed into ludwig-ai:master Sep 18, 2023
2 checks passed