
Use LM Studio, a local AI with OpenAI-compatible endpoints: #1089

Open
@ludiusvox

Description


Feature summary

Use a local LM Studio server running Llama 3.2 as the LLM backend.

Feature description

I use LM Studio, a local AI runtime that exposes OpenAI-compatible endpoints. It can run a local LLM server, so the job-application flow could be pointed at my local machine's endpoint instead of OpenAI's hosted API.

This would cost nothing to run and would remove the need for Gemini or a ChatGPT Pro subscription, saving me money. Since the project is open source, I will also look into implementing it myself.
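As a rough illustration, here is a minimal sketch of the idea, assuming the project uses the official `openai` Python client: because LM Studio's local server speaks the OpenAI API, in principle only the base URL, API key, and model name would need to change. The port and model id below are examples and depend on the local LM Studio configuration.

```python
# Minimal sketch (not the project's actual code): point the official `openai`
# Python client at LM Studio's local server instead of api.openai.com.
# LM Studio listens on port 1234 by default; the model id must match
# whatever model is currently loaded in LM Studio.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="lm-studio",                  # LM Studio does not validate the key
)

response = client.chat.completions.create(
    model="llama-3.2-3b-instruct",  # example id; use the model loaded locally
    messages=[
        {"role": "system", "content": "You write concise, tailored job-application answers."},
        {"role": "user", "content": "Summarize my fit for a data engineering role in two sentences."},
    ],
)
print(response.choices[0].message.content)
```

If the existing code already goes through the `openai` client, this could potentially be exposed as a configurable base URL and model name rather than a separate integration.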

Motivation

No response

Alternatives considered

No response

Additional context

No response
