Implement cancellation and timeout for all LLMs, allow passing LLM call options via LLMChain (like was already possible for stop) #1182

Merged: 10 commits into main on May 11, 2023

Conversation

@nfcampos (Collaborator) commented May 9, 2023

No description provided.
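
Going by the title and the diffs below, the change enables calls along these lines. This is a rough sketch only: timeout appears in the diff in this PR, but the signal option and the exact call shapes here are assumptions.

import { OpenAI } from "langchain/llms/openai";
import { LLMChain } from "langchain/chains";
import { PromptTemplate } from "langchain/prompts";

const llm = new OpenAI({ temperature: 0 });

// Timeout (ms) on a direct LLM call; the call rejects if it takes too long.
const joke = await llm.call("Tell me a joke.", { timeout: 1000 });

// Cancellation; an AbortSignal is the assumed mechanism here.
const controller = new AbortController();
setTimeout(() => controller.abort(), 1000);
// Rejects if the controller aborts before the model responds.
const cancellable = await llm.call("Tell me a joke.", { signal: controller.signal });

// Passing call options through an LLMChain, as was already possible for stop.
const chain = new LLMChain({
  llm,
  prompt: PromptTemplate.fromTemplate("Tell me a {adjective} joke."),
});
const result = await chain.call({ adjective: "funny", timeout: 1000 });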


Review thread on:

 * Wraps _call and handles memory.
 */
call(
  values: ChainValues & this["llm"]["CallOptions"],
@jacoblee93 (Collaborator) commented May 10, 2023

My ideal signature here (when v1 comes around?) would be:

call(
  values: ChainValues,
  options: this["llm"]["CallOptions"] // Includes "callbacks?"
)

So as not to have any restricted/meta properties mixed in, since I'm sure we'll keep adding more and more. Would that make sense long-term?

Otherwise, if this is the long-term solution, could it make sense to make just options or callOptions (and I guess stop) a special value within the values param here?

call(
  values: ChainValues & {
    options: this["llm"]["CallOptions"]
  }
)

I see you have something like this below for openai-chat, but building it in at a lower level might make sense to me if this is the long-term fix.
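
For comparison, a type-level sketch of the three shapes under discussion; CallOptions below is a stand-in for this["llm"]["CallOptions"], and all of the function names are hypothetical:

type ChainValues = Record<string, unknown>;

interface CallOptions {
  stop?: string[];
  timeout?: number;
  callbacks?: unknown; // per the "includes callbacks?" note above
}

// Shape in this PR: call options merged into the values object.
declare function callMerged(values: ChainValues & CallOptions): Promise<ChainValues>;

// Proposed v1 shape: a separate options parameter, so reserved/meta
// properties can never collide with prompt input variables.
declare function callSeparate(values: ChainValues, options?: CallOptions): Promise<ChainValues>;

// Alternative: a single reserved options key within values.
declare function callNested(values: ChainValues & { options?: CallOptions }): Promise<ChainValues>;

// Call sites under each shape:
await callMerged({ topic: "bears", timeout: 1000 });
await callSeparate({ topic: "bears" }, { timeout: 1000 });
await callNested({ topic: "bears", options: { timeout: 1000 } });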

Another collaborator replied:

I think it could make sense to add callbacks in call options.

@nfcampos (author) replied:

Yeah, the thing is only language models have call options, whereas lots of other objects in the library have callbacks, so I'm not sure about that.

@nfcampos (author) added:

I think overall we need a better solution for passing args to specific steps of a chain (e.g. in an LLMChain the prompt is the first step, the LLM is the second, and the output parser is the third). Not sure what the right approach is just yet.
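
To make the routing problem concrete, a hypothetical call against this PR's merged-values shape (chain here is just a stand-in):

type ChainValues = Record<string, unknown>;
declare const chain: { call(values: ChainValues): Promise<ChainValues> };

// A flat bag of keys doesn't say which step each one targets:
await chain.call({
  topic: "bears", // consumed by the prompt (step 1)
  stop: ["\n"],   // consumed by the LLM (step 2)
  // and there's no obvious slot for an output-parser option (step 3)
});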

Review thread on:

@@ -229,20 +237,20 @@ export abstract class LLM extends BaseLLM {
   */
  abstract _call(
    prompt: string,
-   stop?: string[] | this["CallOptions"],
+   options?: Omit<this["CallOptions"], "timeout">,
A collaborator asked:

Wait, call doesn't actually call _call?

Another collaborator replied:

Yeah, this is a bit confusing. For LLMs, the main entrypoint is generate, which eventually calls _generate (which is abstract). call is a convenience method in BaseLLM that calls generate when a single input/output is expected.

To make it easier for people to implement their own LLMs, there is a simpler base class, LLM, where users only have to implement a single _call method (single input/output) that _generate calls.
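
A minimal skeleton of the hierarchy described in that reply (heavily simplified: the real methods also take options and callbacks, and return an LLMResult rather than raw strings):

abstract class BaseLLM {
  // Main entrypoint: takes a batch of prompts and delegates to _generate.
  async generate(prompts: string[]): Promise<string[][]> {
    return this._generate(prompts);
  }

  // Implemented by concrete models (or by the LLM base class below).
  abstract _generate(prompts: string[]): Promise<string[][]>;

  // Convenience method for the single input/single output case.
  async call(prompt: string): Promise<string> {
    const [[generation]] = await this.generate([prompt]);
    return generation;
  }
}

// Simpler base class for implementers: provide one _call, and _generate
// maps it over each prompt.
abstract class LLM extends BaseLLM {
  abstract _call(prompt: string): Promise<string>;

  async _generate(prompts: string[]): Promise<string[][]> {
    return Promise.all(prompts.map(async (p) => [await this._call(p)]));
  }
}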

@jacoblee93 (Collaborator) commented May 11, 2023

Hey @nfcampos, implemented several of the comments and changes I discussed here: jacoblee93@953f848

I didn't want to just directly merge into your branch before you looked at them.

I added a new declared type, ParsedCallOptions (an alias for Omit<this["CallOptions"], "timeout">), so we can avoid using Omit everywhere, and implemented/tested things with chat models. I also made a lot of the internal model methods (e.g. _generate and _call) take required options params, since they're only ever called by their parent method, which can guarantee the value is populated.

Let me know what you think!
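
A sketch of that ParsedCallOptions declaration pattern (simplified; the BaseCallOptions fields other than timeout are assumptions):

interface BaseCallOptions {
  stop?: string[];
  timeout?: number;
}

abstract class BaseLanguageModelSketch {
  // Subclasses can widen this with their own provider-specific options.
  declare CallOptions: BaseCallOptions;

  // Declared once so internal code doesn't repeat the Omit everywhere;
  // timeout is consumed by the public entrypoint before delegating down.
  declare ParsedCallOptions: Omit<this["CallOptions"], "timeout">;

  // Internal methods take a required options param, since they are only
  // called by their parent method, which guarantees it is populated.
  abstract _generate(
    prompts: string[],
    options: this["ParsedCallOptions"]
  ): Promise<string[][]>;
}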



@agola11 (Collaborator) left a review:

Overall looks good.

@nfcampos added the lgtm label (PRs that are ready to be merged as-is) on May 11, 2023.

@nfcampos merged commit 7fa9181 into main on May 11, 2023; 9 checks passed.