Added implementation for Azure OpenAI #966
Conversation
I don't have access to Vercel. Could one of the maintainers please look at the issues and let me know what to fix?
When will this be available in the langchain package? We are waiting for this one.
I am working on the tests. I think I will have them done by tomorrow around this time. Still hoping for some review though 😊
So I added a few tests for the exports, but honestly the whole experience of running the tests is really a pain for me. I would really love for someone to please run the tests.
OK, I got it all working with node-gyp, and I was able to run all the tests and everything looks good. I also rebased from main. @nfcampos could you please check and merge?
Hi @nfcampos, could you please review this PR and merge it? It is really useful. Thank you very much.
Do we have an ETA for enabling this feature in langchain?
Has this been tested with chat completion streaming? It seems Azure's response has a single blank line (compared to OpenAI's two blank lines) after
Hi folks, is one of you able to test this? I don't have access to an Azure model to test it with.
I've made some small changes; looks good to me, pending someone being able to test this.
I did run the streaming tests against Azure and they looked good to me. But I'd be happy to run other tests, if someone has specific tests in mind.
Cool, if you've tested it, I'm going to merge this now. Thanks a lot for your work on this.
I see the following error when trying to build. Were references moved in the latest build? The original call: `import { OpenAI } from "langchain/llms/openai";`
@acshulk Thanks for reporting, this is fixed in 0.0.71
@nfcampos I'm on 0.0.71 and still getting the error mentioned by @acshulk, on both `import { OpenAI } from "langchain/llms/openai";` and `import { OpenAIEmbeddings } from "langchain/embeddings/openai";`.
@bryantpa That is fixed in 0.0.72 |
Tested on 0.0.71: with Azure OpenAI and stream = true, I can't get the conversation result returned (it hangs in the await). Once stream is set to false, or when using OpenAI, it works.
Tested on 0.0.72, still the same. Using ChatOpenAI with gpt-4-32k.
@baobo5625 Sadly I do not have access to a model on Azure, so I'm unable to test this personally. Perhaps @dersia can clarify whether he tested that scenario?
I don't think it depends on the model; 3.5 should produce the same result.
Found the reason: the last message in Azure streaming is:

```json
{"id":"chatcmpl-7DxBV9AgR8GuvqLyrsmRtDav2M9zJ","object":"chat.completion.chunk","created":1683560125,"model":"gpt-4-32k","choices":[{"index":0,"finish_reason":"stop","delta":{}}],"usage":null}
```

It doesn't use the stop word `[DONE]`.
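A minimal sketch of how a stream consumer could treat both termination signals as end-of-stream: the literal `[DONE]` sentinel that api.openai.com sends, and a final chunk whose `finish_reason` is `"stop"`, as in the Azure payload above. This is illustrative only, not the parser used in langchain itself.

```typescript
// Hedged sketch: recognize both possible end-of-stream signals when
// consuming an OpenAI-compatible SSE stream. `StreamChunk` is a minimal
// illustrative type, not langchain's actual type.
interface StreamChunk {
  choices?: { index: number; finish_reason: string | null; delta: object }[];
}

function isStreamDone(data: string): boolean {
  // api.openai.com terminates the stream with a literal sentinel line.
  if (data.trim() === "[DONE]") return true;
  // Azure (per the payload above) may instead send a final chunk whose
  // finish_reason is "stop" and whose delta is empty.
  try {
    const chunk = JSON.parse(data) as StreamChunk;
    return chunk.choices?.some((c) => c.finish_reason === "stop") ?? false;
  } catch {
    return false; // not JSON and not the sentinel: keep streaming
  }
}
```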
I tested with my local axios implementation; Azure OpenAI does return the `[DONE]` message.
Tried Azure OpenAI vs. OpenAI: the `onmessage` handler can capture `[DONE]` from OpenAI, but not from Azure. Without streaming, it works.
@baobo5625 I believe #1167 will fix your issue, any chance you could pull this locally and test it? |
That could be one way to fix it; however, in my local axios implementation of Azure OpenAI it does return the `[DONE]` message, and it works when I test locally. I suggest adding a TODO there to find out why.
I added an implementation for using langchain with Azure OpenAI. In general, it would be sufficient to make `openAIApiKey` non-mandatory by removing the check that throws for it, but I thought it made sense to add some API surface to make it more obvious how to set up the models to use Azure OpenAI. I added it to the completion models, the embedding model, and the chat model.

All Azure parameters can also be set using environment variables.

I also adjusted the docs accordingly.

I haven't yet adjusted the unit tests, and I would like to adjust those before this is merged.

I know there is already a PR for Azure OpenAI, but I think it makes more sense to use the same models for Azure OpenAI and OpenAI's API, because maintenance is easier that way, and when new features are added, both profit from them. So this PR does not add new models for Azure; it uses the same models, just extended with a few extra parameters that are enforced when the Azure OpenAI API key is set.

This implementation also throws if both API keys, `openAIApiKey` and `azureOpenAIApiKey`, are missing.
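As an illustrative sketch of the key-resolution behavior described above (the field and environment-variable names here are assumptions based on this description, not the merged API): accept either key, from constructor fields or environment variables, and throw only when both are missing.

```typescript
// Hedged sketch: resolve which API key to use, preferring Azure when its
// key is set, and throw when neither key is available. Field and env-var
// names are illustrative assumptions, not langchain's definitive API.
interface OpenAIAuthFields {
  openAIApiKey?: string;
  azureOpenAIApiKey?: string;
}

function resolveApiKey(fields: OpenAIAuthFields): { key: string; isAzure: boolean } {
  const openAIApiKey = fields.openAIApiKey ?? process.env.OPENAI_API_KEY;
  const azureOpenAIApiKey = fields.azureOpenAIApiKey ?? process.env.AZURE_OPENAI_API_KEY;
  // When the Azure key is set, the Azure-specific parameters are enforced.
  if (azureOpenAIApiKey) return { key: azureOpenAIApiKey, isAzure: true };
  if (openAIApiKey) return { key: openAIApiKey, isAzure: false };
  throw new Error("Either openAIApiKey or azureOpenAIApiKey must be set");
}
```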