Passing global context into tools called by the runTools helper #597
Could you use a closure for this?

```typescript
async function callFunctions() {
  const context = {};
  // updateEvent closes over `context`, so it needs no extra parameter
  async function updateEvent(args: ArgsFromOpenAi) {
    const { eventId } = context;
    const event = await fetchEvent(eventId);
  }
  await client.beta.messages.runTools({
    tools: [{ type: 'function', function: { function: updateEvent } }],
  });
}
```
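To make the closure idea concrete, here is a self-contained sketch that runs without the SDK; `fetchEvent` and the return value are illustrative stand-ins, not the library's API:

```typescript
// Stand-in for an app-level data fetch; illustrative only.
const fetchEvent = async (eventId: string) => ({ id: eventId });

async function callFunctions(): Promise<string> {
  const context = { eventId: 'evt_1' };
  // updateEvent closes over `context`, so no extra parameter is needed.
  async function updateEvent(_args: string): Promise<string> {
    const event = await fetchEvent(context.eventId);
    return event.id;
  }
  // Stand-in for handing the tool to runTools: invoke it directly here.
  return updateEvent('{}');
}
```

The limitation the next comment raises is visible here: the tool must be defined in the same lexical scope as the context.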
I considered a closure like you wrote, but it would require merging tools defined across multiple files into a single (very) large file. I'm currently leveraging an inherited class to provide the closure, but running into some typing limitations with it that are worked around with casting and use of `any`. LMK if I'm missing a simpler solution 🙏
Got it, that makes sense. Can you share more examples of what you've implemented or what you'd want this to look like / how you'd want to use it? E.g., where would you want to store context, and how would you expect to update it?
I'm imagining something that matches the lifecycle of runTools. The context should stay the same throughout that run. It is probably simplest to pass down a context through the
My implementation is kind of a hack that I worked up after realizing that

Thanks. Could you provide a more complete code sample of what you're trying to do / how you're trying to use this? Including how you update and reference the context?
Have you tried using

```typescript
// in one file
const updateEvent = (context: Context) =>
  async function updateEvent(args: ArgsFromOpenAI) {
    const { eventId } = context;
    const event = await fetchEvent(eventId);
  };

// in another
const context = {…};
await client.beta.messages.runTools({
  tools: [{
    type: 'function',
    function: {
      function: updateEvent(context),
      name: 'updateEvent',
    },
  }],
});
```
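The curried-factory pattern above can be sketched end to end without the SDK; `Context`, `ToolDef`, and `fetchEvent` below are illustrative stand-ins for the library's types:

```typescript
// Illustrative stand-ins for the SDK's tool types.
interface Context { eventId: string }

interface ToolDef {
  type: 'function';
  function: { name: string; function: (args: string) => Promise<string> };
}

const fetchEvent = async (eventId: string) => ({ id: eventId, name: 'Demo' });

// Factory: binds the context once, returning the callback runTools would invoke.
const makeUpdateEvent = (context: Context) =>
  async function updateEvent(_args: string): Promise<string> {
    const event = await fetchEvent(context.eventId);
    return `updated ${event.id}`;
  };

const context: Context = { eventId: 'evt_123' };
const tools: ToolDef[] = [
  { type: 'function', function: { name: 'updateEvent', function: makeUpdateEvent(context) } },
];
```

Because the factory takes the context as a parameter, the tool definition can live in a different file from the call site, which addresses the single-large-file concern.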
That pattern works! Thanks for the suggestion. The caveat is that the tool definition would have to be managed within the scope of the context, which requires a good bit of refactoring for me. You can close this issue if you think it's best that providing context not be built into the library!

Thanks! Hmm, it might be optimal, but I'd like to provide the best possible experience. Would you be willing to share a more complete code sample of what you'd ideally like to see, including how you update & read from context?
Sorry for the late response -- I've been pushing to get the feature launched and left this as tech debt. I was able to circle back to clean it up. Here's how my implementation looks with the function closure:

```typescript
// types.ts
interface ToolContext {
  eventId: string;
}

// eventManagerTools.ts
const updateEvent = (context: ToolContext) =>
  async function updateEvent(eventDetails: UpdateEventArgs) {
    const { eventId } = context;
    // ...
  };

export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.UPDATE_EVENT]: {
    name: EventManagerToolNames.UPDATE_EVENT,
    description: 'Updates event given one or more event details from customer. Only call when values have changed',
    function: updateEvent,
    parse: JSON.parse,
    parameters: {
      type: 'object',
      properties: {
        maxBudgetPerGuest: {
          type: 'number',
          description:
            'Sets maximum budget per guest. This should only include numeric values. If math is required, think through it and provide the output',
        },
        numDays: {
          type: 'number',
          description: 'Duration (in days) of the event',
        },
        // ...
      },
    },
  },
};

// llmFacade.ts
export type LLMFunctionWithContext<Args extends object | string> = Omit<RunnableFunction<Args>, 'function'> & {
  function: (context: BoompopToolContext) => RunnableFunction<Args>['function'];
};

// helper to convert the function-like definitions to tools
export function toTools(llmFunctions: LLMFunctionWithContext<any>[], context: BoompopToolContext) {
  return llmFunctions.map((llmFunction) => ({
    type: 'function',
    function: {
      ...llmFunction,
      function: llmFunction.function(context),
    },
  }));
}

export async function completionStreamWithTools(systemPrompt: string, tools: RunnableToolFunction<any>[]) {
  // simplified as an example
  const runner = ChatCompletionStreamingRunner.runTools(openai.chat.completions, {
    messages,
    model,
    tools,
    temperature,
    stream: true,
  });
}

// llmOrchestrator.ts
async function orchestrateResponse() {
  const agent = {
    tools: Object.values(eventManagerTools),
  };
  // pass context scoped to this single stream call
  await completionStreamWithTools('Plan an event', toTools(agent.tools, { eventId }));
}
```

This feels fairly good. The only caveat is I have to override and maintain my own type and wrapper to convert to the function type that `runTools` expects.

It would be great to simplify the above by being able to simply change:

```typescript
export async function completionStreamWithTools(systemPrompt: string, tools: RunnableToolFunction<any>[], globalToolContext: ToolContext)
```

Then, I would call runTools like this:

```typescript
const runner = ChatCompletionStreamingRunner.runTools(openai.chat.completions, {
  messages,
  model,
  tools,
  toolContext: globalToolContext,
  temperature,
  stream: true,
});
```

I imagine that the tool context would then be provided to each tool with something like this:

```typescript
updateEvent({ ... }: Args, runner: Runner, toolContext: ToolContext)
```

Alternatively, the tool context could be destructured into Args, but that might be more complicated than it is worth.
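The `toTools` idea above can be exercised as a standalone sketch; `RunnableFunctionLike` below is a stand-in approximating the library's `RunnableFunction`, and the tool body is illustrative:

```typescript
// Stand-ins for the SDK types; RunnableFunctionLike approximates RunnableFunction.
interface ToolContext { eventId: string }

interface RunnableFunctionLike {
  name: string;
  description: string;
  function: (args: string) => Promise<string>;
}

// The author's pattern: each tool exports a context-taking factory rather than a bare function.
type FnWithContext = Omit<RunnableFunctionLike, 'function'> & {
  function: (context: ToolContext) => RunnableFunctionLike['function'];
};

// toTools binds the per-run context into every tool just before the run starts.
function toTools(fns: FnWithContext[], context: ToolContext) {
  return fns.map((fn) => ({
    type: 'function' as const,
    function: { ...fn, function: fn.function(context) },
  }));
}

const updateEvent: FnWithContext = {
  name: 'updateEvent',
  description: 'Updates an event',
  function: (context) => async (_args) => `updated ${context.eventId}`,
};

const tools = toTools([updateEvent], { eventId: 'evt_42' });
```

Since the context is bound at `toTools` time, it stays constant for the lifetime of one run, matching the lifecycle described earlier in the thread.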
Interesting. Thank you very much for sharing, this is quite helpful.

What do you think about something like this, so you don't have to subclass or write your own wrapper?

```typescript
export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.UPDATE_EVENT]: new RunnableFunction({
    description: 'Updates event given one or more event details from customer. Only call when values have changed',
    function: updateEvent(context),
    // …
```

```typescript
const runner = openai.beta.chat.completions.runTools({
  messages,
  model,
  tools,
  // …
```
Oh, that's a nice suggestion! Though, I think that might still lead to folks wanting to DRY up the repetition, e.g. to DRY up this:

```typescript
export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.TOOL_A]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_B]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_C]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_D]: new RunnableFunction(...),
  // ...
};
```
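One way to DRY up a record like that, sketched here with illustrative stand-in types rather than the SDK's, is a single helper that walks the record and binds the context in one pass:

```typescript
interface ToolContext { eventId: string }
type ToolFn = (args: string) => Promise<string>;
type ToolFactory = (context: ToolContext) => ToolFn;

// A record of context-taking factories, keyed by tool name; bodies are illustrative.
const factories: Record<string, ToolFactory> = {
  toolA: (ctx) => async () => `A:${ctx.eventId}`,
  toolB: (ctx) => async () => `B:${ctx.eventId}`,
};

// Bind the context across the whole record at once, producing a tool array.
function bindTools(fs: Record<string, ToolFactory>, context: ToolContext) {
  return Object.entries(fs).map(([name, factory]) => ({
    type: 'function' as const,
    function: { name, function: factory(context) },
  }));
}

const tools = bindTools(factories, { eventId: 'evt_7' });
```

This keeps each tool definition a one-liner while still giving every tool the same per-run context.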
Confirm this is a feature request for the Node library and not the underlying OpenAI API.
Describe the feature or improvement you're requesting
I currently have a pattern where I need to pass context to my tools to allow them to act on my app. For example:
It'd be great if there were some way to pass a global context to the runner since the runner is passed into each function call. Then, I could do something like this:
Additional context
A workaround is to build my own runner that leverages the existing helpers. However, this is complicated because of the types integration.