
Add ability to change open api base url #9

Closed
s-kostyaev opened this issue Nov 30, 2023 · 7 comments

Comments

@s-kostyaev
Contributor

Hi.
There are many open source solutions that mimic the OpenAI API. It would be useful if we could use them with the llm library. Right now there is no working local embedding solution that works with llm: Ollama's and llama.cpp's are broken at the moment (fixes in progress). I wrote a simple one myself, but I can't use it with llm. I could also mimic the OpenAI API and use your library. Since the vss extension for SQLite is already supported in 29.2, this would be very useful.

@ahyatt
Owner

ahyatt commented Dec 1, 2023

Thanks for opening this. I'm not sure it's a good idea to have an option like that for general use yet; I'd like to be able to point to a more typical use case that needs it first. I'd like to unblock you, though, so I could perhaps create a branch for this, and if there's a general need for it later, I can merge it into the main branch. Would that be OK?

@s-kostyaev
Contributor Author

No need to unblock me; I wrote elisp to get embeddings from my local API and can wait for this to be fixed upstream in llama.cpp and ollama. It could be useful with something like https://github.com/oobabooga/text-generation-webui or https://github.com/BerriAI/litellm, though.

@r0man

r0man commented Dec 14, 2023

I'm also interested in this, since I need to talk to the OpenAI API through a proxy that uses mutual TLS (I also need to specify a client certificate and CLI eventually, but that's another story).

@ahyatt
Owner

ahyatt commented Dec 15, 2023

@r0man I recently received a similar request via another channel - do you need an API key for your use case? The other request was essentially for proxy support where the proxy itself takes care of authentication.

@r0man

r0man commented Dec 18, 2023

Hi @ahyatt, yes, I would need to use an API key via the Bearer header. That API key is not the one from OpenAI, but one I was assigned by my company. It shouldn't matter, though, as long as I can customize it with the llm-openai struct.

ahyatt closed this as completed in caa1c6b on Dec 19, 2023
@ahyatt
Owner

ahyatt commented Dec 19, 2023

OK, all, I've pushed a fix for this, which is a new provider, llm-openai-compatible (see the README for more details). Please let me know if that works.
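Based on this description, usage of the new provider might look roughly like the sketch below. The `:url` and `:key` slot names are assumptions drawn from the existing `llm-openai` struct and have not been checked against the README; the endpoint URL and key are placeholders.

```elisp
;; Hedged sketch: assumes llm-openai-compatible is a struct inheriting
;; from llm-openai, with a :url slot for the base URL and a :key slot
;; for the Bearer token. Adjust to whatever the README actually documents.
(require 'llm-openai)

;; Point the provider at a local OpenAI-compatible server,
;; e.g. litellm or text-generation-webui's API endpoint.
(setq my-llm-provider
      (make-llm-openai-compatible
       :url "http://localhost:8000/v1/"   ; placeholder base URL
       :key "my-company-api-key"))        ; sent as a Bearer header
```

For r0man's use case, the company-assigned key would go in `:key` and the mTLS proxy's address in `:url`; the client-certificate handling itself would still be Emacs network-layer configuration, outside this struct.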

@r0man

r0man commented Dec 20, 2023

Thank you @ahyatt. I'm not able to test this at the moment, since I still have an issue with the client certificate I need to use. But it looks good to me.
