
Allow different LLM objects for each PromptTemplate in SequentialChain #231

Closed
sjwhitmore opened this issue Nov 30, 2022 · 2 comments

@sjwhitmore
Contributor

I would like to use SequentialChain with the option to use a different LLM object at each step. The rationale is that I use different temperature settings for different prompts within my chain, and I may also want to use different models for each step in the future.

A rough idea for the config: have a JSON dict specifying the LLM config, and pass in a list of configs (or a list of LLM objects) with the same length as the number of PromptTemplates in the chain when you want a different object per step, or a single LLM object or config object when you want the same one for every step.
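To make the proposal concrete, here is a minimal self-contained sketch of that dispatch logic. The class and helper names (`SequentialChainSketch`, `FakeLLM`, `LLMConfig`) are hypothetical stand-ins, not actual LangChain classes; the model name is illustrative only.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class LLMConfig:
    """Hypothetical per-step LLM config, per the proposal above."""
    model: str = "text-davinci-003"  # illustrative model name
    temperature: float = 0.7

class FakeLLM:
    """Toy LLM that echoes its config so the per-step routing is visible."""
    def __init__(self, config: LLMConfig):
        self.config = config

    def __call__(self, prompt: str) -> str:
        return f"[{self.config.model}@t={self.config.temperature}] {prompt}"

class SequentialChainSketch:
    """Accepts either one LLM (shared by all steps) or a list with the
    same length as the list of prompt templates."""
    def __init__(self, templates: List[str],
                 llms: Union[FakeLLM, List[FakeLLM]]):
        if isinstance(llms, list):
            if len(llms) != len(templates):
                raise ValueError("need exactly one LLM per prompt template")
            self.llms = llms
        else:
            # Single LLM: reuse it for every step.
            self.llms = [llms] * len(templates)
        self.templates = templates

    def run(self, text: str) -> str:
        # Feed each step's output into the next step's template.
        for template, llm in zip(self.templates, self.llms):
            text = llm(template.format(input=text))
        return text

chain = SequentialChainSketch(
    ["Summarize: {input}", "Translate: {input}"],
    [FakeLLM(LLMConfig(temperature=0.0)),
     FakeLLM(LLMConfig(temperature=0.9))],
)
print(chain.run("hello"))
```

Running this shows the second step executing with temperature 0.9 while the first ran at 0.0, which is the behavior the issue asks for.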

@hwchase17
Contributor

@sjwhitmore is this already in there? or is more work needed?

@sjwhitmore
Contributor Author

sjwhitmore commented Dec 1, 2022 via email
