Simplify createChatCompletion #666

@kamath

Description

Right now, you can theoretically use the stagehand.llmClient object to access the underlying LLM for Stagehand.

This is helpful when you want to ask the LLM custom questions that guide your next step -- for example, if you're writing a Stagehand script, you might want to ask the LLM which word to input next. This is possible today, but it's really cumbersome:

[Screenshot: the current, cumbersome way of calling the LLM through stagehand.llmClient]

It would be awesome to have stagehand.llmClient.generateText() and stagehand.llmClient.generateObject() like the Vercel AI SDK. Bonus points if you add streamText :)
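As a rough sketch of what that surface could look like, here is a self-contained TypeScript mock mirroring the Vercel AI SDK's `generateText`/`generateObject` shapes. The `LLMClient` interface and the stub implementation below are hypothetical stand-ins for illustration, not Stagehand's real API:

```typescript
// Hypothetical shape for the proposed llmClient surface -- these
// interfaces are illustrative stand-ins, not Stagehand's actual types.
interface GenerateTextResult {
  text: string;
}

interface LLMClient {
  generateText(opts: { prompt: string }): Promise<GenerateTextResult>;
  generateObject<T>(opts: {
    prompt: string;
    // Parses/validates the raw model output into a typed object
    // (the Vercel AI SDK uses a Zod schema here).
    schema: (raw: unknown) => T;
  }): Promise<{ object: T }>;
}

// Minimal mock implementation so this sketch runs standalone.
const llmClient: LLMClient = {
  async generateText({ prompt }) {
    return { text: `echo: ${prompt}` };
  },
  async generateObject({ prompt, schema }) {
    // Pretend the model answered with the last word of the prompt.
    return { object: schema({ word: prompt.split(" ").pop() }) };
  },
};

async function main() {
  // Ask the LLM a free-form question mid-script...
  const { text } = await llmClient.generateText({
    prompt: "What word should I type into the search box next?",
  });
  console.log(text);

  // ...or get a structured, typed answer back.
  const { object } = await llmClient.generateObject({
    prompt: "Pick the next word to input",
    schema: (raw) => raw as { word: string },
  });
  console.log(object.word);
}

main();
```

A `streamText` variant would follow the same pattern but return an async iterable of text chunks instead of a single result.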
