Local LLM models? Integration with Oobabooga if required #137

@juangea

Description

Describe the feature you'd like
To be able to use this whole system locally, so we can use local models like Wizard-Vicuna without having to share our data with OpenAI or other sites or clouds.

Maybe an option that avoids a full local LLM implementation would be to have it communicate with Oobabooga through its API. I'm not sure, though; I suspect it's similar to talking to ChatGPT.
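For illustration, here is a minimal sketch of what that could look like, assuming Oobabooga's OpenAI-compatible API extension is enabled. The URL, port, and model name below are placeholders, not confirmed values; the point is that the request shape is the same one used for ChatGPT, so only the endpoint would need to change:

```python
import json

# Hypothetical local endpoint; the actual host/port depend on how the
# text-generation-webui API extension is launched.
OOBABOOGA_URL = "http://127.0.0.1:5000/v1/chat/completions"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat-completion payload.

    If the local server speaks the same protocol as OpenAI's API,
    this exact payload should work against both.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

payload = build_chat_request("Hello from a local Wizard-Vicuna model!")
print(json.dumps(payload, indent=2))

# To actually send it (requires a running local server):
#   import urllib.request
#   req = urllib.request.Request(
#       OOBABOOGA_URL,
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Since the payload matches the ChatGPT request format, an existing OpenAI integration could in principle be pointed at the local server by swapping the base URL.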

Will this be implemented at some point?

Thanks!
