Open
Description
Feature summary
Use a local LM Studio server (e.g. running Llama 3.2) as the LLM backend instead of a paid API
Feature description
I use LM Studio, a local AI runtime that exposes an OpenAI-compatible API. It lets me run an LLM on a local server, so this tool could apply for jobs using a local machine's endpoint instead of OpenAI's.
This would cost me nothing to run, and I wouldn't need Gemini or ChatGPT Pro, which would save me money. Since this project is open source, I will also look into implementing it myself.
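As a rough sketch of what this would look like: because LM Studio speaks the OpenAI wire format, the change is mostly swapping the base URL the tool talks to. The snippet below builds an OpenAI-style chat-completions request against a local endpoint using only the standard library; the port (1234 is LM Studio's default), the model name, and the placeholder API key are assumptions, not values from this project.

```python
import json
from urllib import request

# Assumption: LM Studio's local server with its default port.
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"

def build_chat_request(base_url, messages, model="llama-3.2-3b-instruct"):
    """Build an OpenAI-style chat-completions request for a local endpoint.

    The model name is whatever the local server reports for the loaded
    model ("llama-3.2-3b-instruct" here is a hypothetical example).
    """
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return request.Request(
        base_url + "/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but OpenAI-style
            # clients expect the header to be present.
            "Authorization": "Bearer lm-studio",
        },
    )

req = build_chat_request(
    LMSTUDIO_BASE_URL,
    [{"role": "user", "content": "Draft a short cover letter."}],
)
print(req.full_url)  # http://localhost:1234/v1/chat/completions
```

Sending `req` with `urllib.request.urlopen` (or pointing an existing OpenAI client's `base_url` at the same address) would route all completions through the local model instead of a paid API.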
Motivation
No response
Alternatives considered
No response
Additional context
No response