Add ability to change OpenAI API base URL #9
Comments
Thanks for opening this. I'm not sure it's a good idea to have an option like that for general use yet; I'd like to be able to explain a more normal use case that needs it first. I'd like to unblock you, though, so I could perhaps create a branch for this, and if there's a general need for it, I can merge it into the main branch. Would that be OK?
No need to unblock me; I wrote some Elisp to get embeddings from my local API and can wait for this to be fixed upstream in llama.cpp and Ollama. It could still be useful with something like https://github.com/oobabooga/text-generation-webui or https://github.com/BerriAI/litellm, though.
I'm also interested in this, since I need to go through a proxy that uses mutual TLS to talk to the OpenAI API (I also need to specify a client certificate and CLI eventually, but that's another story).
@r0man I recently received another similar request via another channel. Do you need an API key for your use? The other request for proxy functionality was basically to use a proxy where the proxy itself takes care of authentication.
Hi @ahyatt, yes, I would need to use an API key via the Bearer header. That API key is not the one from OpenAI, but one I was assigned by my company. It shouldn't matter though, as long as I can customize it with the
OK, all, I've pushed a fix for this, which is a new provider,
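For readers landing here later: if the new provider behaves like the existing OpenAI one, usage might look roughly like the sketch below. The constructor name, keyword arguments, local URL, and model name are my assumptions for illustration, not taken from the commit, so check the library's docs for the real slots.

```elisp
;; Sketch only: assumes the new provider is built with
;; `make-llm-openai-compatible' and accepts a base URL plus an optional
;; key that gets sent as an "Authorization: Bearer ..." header.
(require 'llm)
(require 'llm-openai)

;; Point the provider at a local OpenAI-compatible server instead of
;; api.openai.com; URL, key, and model name are placeholders.
(defvar my-local-provider
  (make-llm-openai-compatible
   :url "http://localhost:8080/v1/"
   :key "my-company-key"              ; whatever the proxy expects
   :embedding-model "all-minilm"))

;; Synchronous embedding call through llm's generic interface.
(llm-embedding my-local-provider "The quick brown fox")
```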
Thank you @ahyatt. I'm not able to test this at the moment, since I still have an issue with the client certificate I need to use. But it looks good to me. |
Hi.
There are many open source solutions mimicking the OpenAI API, so it would be useful if we could use them with the llm library. Right now there is no local embedding solution that works with llm: Ollama's and llama.cpp's are both broken (fixes are in progress). I wrote a simple one myself, but I can't use it with llm; I could also mimic the OpenAI API and use your library. Since the vss extension for sqlite is already supported in 29.2, this would be very useful.
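For context, a hand-rolled call like the one described might look something like the sketch below, assuming a local server that mimics the OpenAI /v1/embeddings endpoint. The URL, port, model name, and exact response shape are assumptions about such a server, not part of the llm library.

```elisp
;; Rough sketch: POST to a local OpenAI-style embeddings endpoint and
;; pull the vector out of an OpenAI-shaped response (data[0].embedding).
(require 'url)
(require 'json)

(defun my-local-embedding (text)
  "Return the embedding vector for TEXT from a local OpenAI-style API."
  (let* ((url-request-method "POST")
         (url-request-extra-headers '(("Content-Type" . "application/json")))
         (url-request-data
          (json-encode `(("model" . "local-model") ("input" . ,text))))
         (buf (url-retrieve-synchronously "http://localhost:8080/v1/embeddings")))
    (with-current-buffer buf
      (goto-char url-http-end-of-headers)
      (let ((response (json-parse-buffer :object-type 'alist)))
        (alist-get 'embedding (elt (alist-get 'data response) 0))))))
```

The point of the issue is that a configurable base URL in the llm library itself would make this kind of one-off glue code unnecessary.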