
[Feature] Add support for the Azure OpenAI API #371

Closed
edisonzf2020 opened this issue Apr 2, 2023 · 126 comments

Labels
documentation (Improvements or additions to documentation), enhancement (New feature or request)

Comments

@edisonzf2020

No description provided.

@edisonzf2020
Author

Changing the BASE_URL to the Microsoft endpoint and the API key to the Microsoft key does not work. Azure requires additional parameters to be set.
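For context, the Azure OpenAI API differs from the OpenAI API in more than just the host: the deployment name is part of the URL path, the API version is a required query parameter, and the key goes in an api-key header rather than an Authorization: Bearer header. A minimal sketch of the two request shapes (the resource name, deployment name, and API version below are placeholders):

```ts
// Minimal sketch of the request-shape difference; all identifiers are placeholders.
const apiKey = process.env.API_KEY!;

// OpenAI: model name in the JSON body, Bearer token in the Authorization header.
await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
  body: JSON.stringify({ model: "gpt-3.5-turbo", messages: [{ role: "user", content: "hi" }] }),
});

// Azure OpenAI: deployment name in the path, api-version as a query parameter,
// and the key in an "api-key" header; the body needs no "model" field.
await fetch(
  "https://my-resource.openai.azure.com/openai/deployments/my-deployment/chat/completions?api-version=2023-05-15",
  {
    method: "POST",
    headers: { "Content-Type": "application/json", "api-key": apiKey },
    body: JSON.stringify({ messages: [{ role: "user", content: "hi" }] }),
  },
);
```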

@tiny656

tiny656 commented Apr 2, 2023

Search for an "azure openai proxy" project; that should meet your needs.

@ripenedcat

#258

@ripenedcat

ripenedcat commented Apr 3, 2023

> Search for an "azure openai proxy" project; that should meet your needs.

Thanks for the idea. I used https://github.com/diemus/azure-openai-proxy/blob/main/README.zh-cn.md, replaced this project's base URL with the Docker server URL, and changed https to http. It works!

@AprilNEA AprilNEA added the duplicate This issue or pull request already exists label Apr 3, 2023
@Yidadaa Yidadaa added the enhancement New feature or request label Apr 3, 2023
@Yidadaa
Collaborator

Yidadaa commented Apr 3, 2023

@ripenedcat

Thanks for the feedback. I will add this approach to the README later.

@cl1107

cl1107 commented Apr 4, 2023

> Search for an "azure openai proxy" project; that should meet your needs.

> Thanks for the idea. I used https://github.com/diemus/azure-openai-proxy/blob/main/README.zh-cn.md, replaced this project's base URL with the Docker server URL, and changed https to http. It works!

This project is also good: https://github.com/stulzq/azure-openai-proxy

@doherty88

How did you get it working? I set up a proxy with the azure-openai-proxy project; calling it from code works, but ChatGPT-Next-Web fails with error code 1003. It feels as if the Azure API does not support streaming?

@ripenedcat

> How did you get it working? I set up a proxy with the azure-openai-proxy project; calling it from code works, but ChatGPT-Next-Web fails with error code 1003. It feels as if the Azure API does not support streaming?

Check whether the model selected in ChatGPT-Next-Web is included in azure-openai-proxy's environment variables, and whether common.ts has been switched to http mode (if you don't have an https reverse proxy). Also check the azure-openai-proxy logs for clues.

@haibbo

haibbo commented Apr 5, 2023

@Yidadaa

If a user has no server, they can go through a Cloudflare Worker:

https://github.com/haibbo/cf-openai-azure-proxy
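For readers unfamiliar with the approach, the core of such a worker is rewriting an OpenAI-style request into the Azure shape before forwarding it. The following is a simplified illustration of that idea, not the actual cf-openai-azure-proxy code; the resource name, deployment name, and API version are placeholder assumptions:

```ts
// Simplified illustration of an OpenAI -> Azure rewrite in a Cloudflare Worker.
// Not the actual cf-openai-azure-proxy code; these constants are placeholders.
const RESOURCE = "my-resource";
const DEPLOYMENT = "my-deployment";
const API_VERSION = "2023-05-15";

export default {
  async fetch(request: Request): Promise<Response> {
    // Map /v1/chat/completions onto the Azure deployment path.
    const url = new URL(request.url);
    const target =
      `https://${RESOURCE}.openai.azure.com/openai/deployments/${DEPLOYMENT}` +
      `${url.pathname.replace("/v1", "")}?api-version=${API_VERSION}`;

    // Move the Bearer token into Azure's api-key header.
    const headers = new Headers(request.headers);
    const auth = headers.get("Authorization") ?? "";
    headers.set("api-key", auth.replace("Bearer ", ""));
    headers.delete("Authorization");

    return fetch(target, { method: request.method, headers, body: request.body });
  },
};
```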

@diemus

diemus commented Apr 6, 2023

> How did you get it working? I set up a proxy with the azure-openai-proxy project; calling it from code works, but ChatGPT-Next-Web fails with error code 1003. It feels as if the Azure API does not support streaming?

@doherty88 Is it the one I developed? https://github.com/diemus/azure-openai-proxy

If you hit a problem, please open an issue with reproduction steps. Some web projects call special endpoints, such as fetching the model list or the account balance, which Azure does not provide; I will add a mock endpoint soon to simulate responses for such requests. If that is the cause, a temporary workaround is to switch to another frontend project, since some of them don't call those endpoints. Streaming, by the way, is supported.
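To illustrate the kind of mock endpoint mentioned above: the proxy can answer /v1/models itself with a fixed list so that frontends which probe it don't fail against Azure. This is only a hypothetical sketch of the idea, not code from azure-openai-proxy (which is written in Go); the forwardToAzure helper is assumed:

```ts
// Hypothetical sketch: answer GET /v1/models locally with a fixed list and
// forward everything else to the Azure rewrite logic (not shown here).
declare function forwardToAzure(request: Request): Promise<Response>; // assumed helper

export async function handle(request: Request): Promise<Response> {
  const url = new URL(request.url);

  if (request.method === "GET" && url.pathname === "/v1/models") {
    // A fixed model list in the OpenAI response shape, since Azure has no such endpoint.
    const body = {
      object: "list",
      data: [{ id: "gpt-3.5-turbo", object: "model", owned_by: "azure" }],
    };
    return new Response(JSON.stringify(body), {
      headers: { "Content-Type": "application/json" },
    });
  }

  return forwardToAzure(request);
}
```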

@lzhgus

lzhgus commented Apr 6, 2023

> How did you get it working? I set up a proxy with the azure-openai-proxy project; calling it from code works, but ChatGPT-Next-Web fails with error code 1003. It feels as if the Azure API does not support streaming?

I ran into the same problem, using a Cloudflare Worker. It also seems related to streaming: the reply keeps loading and then shows "出错了,稍后重试吧" ("Something went wrong, please try again later").

@doherty88

Yes, that's the one I'm using.
I now deploy ChatGPT-Next-Web on Railway; with the same proxy Docker container and the same environment variable settings, it works.

@lzhgus

lzhgus commented Apr 7, 2023

> Yes, that's the one I'm using. I now deploy ChatGPT-Next-Web on Railway; with the same proxy Docker container and the same environment variable settings, it works.

Thanks. With Railway I still have a problem:

{
  "cause": {
    "errno": -3008,
    "code": "ENOTFOUND",
    "syscall": "getaddrinfo",
    "hostname": "https"
  }
}

@doherty88

Did you set the PROTOCOL environment variable? My Azure proxy runs in http mode, so I set that variable to "http".
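For reference, "hostname": "https" in the error above usually means the scheme got parsed as the host, i.e. the base URL was supplied with https:// included while the protocol is also configured separately. A small illustration of that failure mode (the variable names follow this thread's usage; check the project's current README before relying on them):

```ts
// Illustration of the failure mode: if BASE_URL already contains the scheme and
// the code prepends PROTOCOL again, Node ends up resolving "https" as a hostname.
const PROTOCOL = "https";
const BASE_URL = "https://my-proxy.example.com"; // scheme included by mistake

const requestUrl = `${PROTOCOL}://${BASE_URL}/v1/chat/completions`;
console.log(new URL(requestUrl).hostname);
// => "https" — getaddrinfo then fails with ENOTFOUND for hostname "https".

// A working setup in this scenario would be (hedged, names per this thread):
//   BASE_URL=my-proxy.example.com   // host only, no scheme
//   PROTOCOL=http                   // if the proxy is plain http
```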

@Jer-y

Jer-y commented Apr 7, 2023

> Yes, that's the one I'm using. I now deploy ChatGPT-Next-Web on Railway; with the same proxy Docker container and the same environment variable settings, it works.

> Thanks. With Railway I still have a problem:
> { "cause": { "errno": -3008, "code": "ENOTFOUND", "syscall": "getaddrinfo", "hostname": "https" } }

@lzhgus
@lzhgus

lzhgus commented Apr 7, 2023

> Yes, that's the one I'm using. I now deploy ChatGPT-Next-Web on Railway; with the same proxy Docker container and the same environment variable settings, it works.

> Thanks. With Railway I still have a problem:
> { "cause": { "errno": -3008, "code": "ENOTFOUND", "syscall": "getaddrinfo", "hostname": "https" } }

> @lzhgus

Thanks for the reply. I think the base URL was indeed the problem; it works now. And yes, I also deployed everything on Azure, using Container Apps.
One thing I'm not sure everyone has noticed: possibly because of the proxy, the client-side behavior has changed. It no longer prints character by character as before; instead the output is slightly choppy, or the whole reply appears at once.

@Jer-y

Jer-y commented Apr 7, 2023

> @lzhgus
> Thanks for the reply. I think the base URL was indeed the problem; it works now. And yes, I also deployed everything on Azure, using Container Apps. One thing I'm not sure everyone has noticed: possibly because of the proxy, the client-side behavior has changed. It no longer prints character by character as before; instead the output is slightly choppy, or the whole reply appears at once.

If I'm not mistaken, this is an Azure Container Apps issue: the default ingress proxy has buffering enabled, which breaks the typewriter effect. Since it doesn't affect my usage I haven't dug into it. If you want the typewriter effect, deploying it yourself by other means (Azure VM, AKS, etc.) is the best option.

@diemus

diemus commented Apr 7, 2023

It's not necessarily the proxy. A direct connection to Azure also lacks the typewriter effect; I tested this earlier. I wouldn't worry too much about it.

@Jer-y

Jer-y commented Apr 7, 2023

Do you mean the Playground? Ha, they haven't implemented the typewriter effect there either, and code output hasn't been handled at all: the raw Markdown is returned as-is. I've already opened an issue with the dev team, though I doubt they'll fix it any time soon. I agree, by the way, that there's no need to obsess over this; it's still usable.

@diemus

diemus commented Apr 7, 2023

Not the Playground. When you request Azure OpenAI directly with streaming enabled, Azure doesn't return token by token; it returns one large chunk, then another. So the typewriter effect is already gone at the Azure layer.

@Jer-y

Jer-y commented Apr 7, 2023

Then your judgment is surely right. I'll open a feature work item for them next Monday, though a feature like this probably has low priority and may drag on.

@Yidadaa Yidadaa added the documentation Improvements or additions to documentation label Apr 7, 2023
@lzhgus

lzhgus commented Apr 7, 2023

I believe the Playground also uses a ReadableStream to handle the streamed response and implement the typewriter effect; I can dig through the codebase to confirm. Compared with calling OpenAI directly, Azure OpenAI Service has higher latency but a much shorter download time, which may be why the typewriter effect looks odd.
That said, it doesn't affect usage :)

[screenshots: latency comparison between OpenAI and Azure OpenAI]
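For anyone curious what handling the stream with a ReadableStream looks like on the client, here is a minimal sketch of typewriter-style rendering; the endpoint and the onText callback are assumptions for illustration, not ChatGPT-Next-Web's actual code:

```ts
// Minimal client-side sketch: read a streamed chat response chunk by chunk
// and append the text as it arrives ("typewriter" rendering).
// The endpoint and onText callback are illustrative assumptions.
async function streamChat(prompt: string, onText: (text: string) => void): Promise<void> {
  const res = await fetch("/api/chat-stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();

  // Each read() resolves with whatever bytes the server has flushed so far.
  // If an upstream proxy buffers, these chunks arrive large and infrequent,
  // which is exactly why the typewriter effect degrades.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onText(decoder.decode(value, { stream: true }));
  }
}
```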

@haibbo

haibbo commented Apr 8, 2023

You can try this one; it implements the typewriter effect:

https://github.com/haibbo/cf-openai-azure-proxy

The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.
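In other words, one Azure chunk can contain several SSE "data:" events; the proxy splits them apart and re-emits them one at a time so the client perceives a steadier token stream. A rough sketch of that splitting step (an illustration of the idea, not the actual cf-openai-azure-proxy implementation):

```ts
// Rough sketch of the re-chunking idea: take an upstream SSE stream that arrives
// in large blocks, split it into individual "data:" events, and enqueue them one
// by one (optionally with a small delay to smooth the output).
// This is an illustration, not the actual cf-openai-azure-proxy code.
function rechunkSSE(upstream: ReadableStream<Uint8Array>): ReadableStream<Uint8Array> {
  const decoder = new TextDecoder();
  const encoder = new TextEncoder();
  let buffer = "";

  return new ReadableStream<Uint8Array>({
    async start(controller) {
      const reader = upstream.getReader();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });

        // SSE events are separated by a blank line ("\n\n").
        const events = buffer.split("\n\n");
        buffer = events.pop() ?? ""; // keep any incomplete trailing event
        for (const event of events) {
          controller.enqueue(encoder.encode(event + "\n\n"));
          await new Promise((resolve) => setTimeout(resolve, 20)); // pacing is optional
        }
      }
      if (buffer) controller.enqueue(encoder.encode(buffer));
      controller.close();
    },
  });
}
```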

@yinm0591
Contributor

yinm0591 commented Apr 9, 2023

> You can try this one; it implements the typewriter effect:
> https://github.com/haibbo/cf-openai-azure-proxy
> The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.

cf-openai-azure-proxy is a great project, thank you!

@Jer-y

Jer-y commented Apr 11, 2023

@haibbo Thanks for sharing! Star given.
@lzhgus is right. I tested over the weekend and compared with OpenAI: Azure's streaming comes back chunk by chunk. Although the actual response speed is better than the original, it looks a bit stuttery. Another odd point is that responses in Chinese/Japanese/Korean have much higher latency than English, which is puzzling; the PM in the US says they use the same setup as OpenAI. I can imagine how long chasing this kind of issue will take. Anyway, if you need it, handle it on the frontend for now.

@hbsgithub

hbsgithub commented Apr 12, 2023

https://github.com/hbsgithub/deno-azure-openai-proxy
Building on @haibbo's excellent work, I added a mapper feature and rewrote it as a TypeScript script deployed on Deno Deploy. Features:

  1. Unlike Cloudflare Workers, it can be used directly without a proxy
  2. Supports custom subdomains (*.deno.dev) or binding your own domain
  3. Supports typewriter-style streaming responses
  4. Supports a mapper for custom model-mapping rules, or passing model names through unchanged (see the sketch after this comment)
  5. No server needed; free online deployment with a quota of 100,000 requests per month

Everyone is welcome to use it!
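A mapper of this kind is essentially a lookup from the model name the client sends to the Azure deployment name, with pass-through as the fallback. A hedged sketch of the idea follows; the environment variable name and the "client=deployment" pair format are assumptions for illustration, not necessarily what deno-azure-openai-proxy uses:

```ts
// Hedged sketch of a model mapper: "clientModel=azureDeployment" pairs read from
// an environment variable, with pass-through when no mapping exists.
// The variable name and pair format are illustrative assumptions only.
const rawMapper = Deno.env.get("MODEL_MAPPER") ?? "gpt-3.5-turbo=my-gpt35-deployment";

const mapper = new Map<string, string>(
  rawMapper
    .split(",")
    .map((pair) => pair.split("=") as [string, string])
    .filter(([from, to]) => Boolean(from && to)),
);

// Resolve the Azure deployment name for an incoming model name.
function resolveDeployment(model: string): string {
  return mapper.get(model) ?? model; // unmapped names pass through unchanged
}

console.log(resolveDeployment("gpt-3.5-turbo")); // => "my-gpt35-deployment"
console.log(resolveDeployment("gpt-4"));         // => "gpt-4" (pass-through)
```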

@Issues-translate-bot

Bot detected the issue body's language is not English; translated automatically:

Sorry, Azure's filtering is outrageous. The word "continue" seems to trigger the filter 100% of the time.
Starting a completely new conversation doesn't help either.

[screenshots: filtered responses]

@johnsonperl

> You can try this one; it implements the typewriter effect:
> https://github.com/haibbo/cf-openai-azure-proxy
> The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.

Even with streaming enabled, Azure returns the text in blocks, because Azure adds a layer of sensitive-word filtering. We asked Microsoft to turn that filtering off (in the deployed model's options a Microsoft.Nil choice appears in the content filter; just select it), and after that the behavior is the same as ChatGPT.


@liudhzhyym2

> Not the Playground. When you request Azure OpenAI directly with streaming enabled, Azure doesn't return token by token; it returns one large chunk, then another. So the typewriter effect is already gone at the Azure layer.

> I suggest applying to modify the content filter. Once approved, you can remove the content filter once and for all.

> Could you explain how to remove the content filter?

> Just click the "modify content filter" text in my reply above (I embedded the URL), or here is the link directly: https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xURE01NDY1OUhBRzQ3MkQxMUhZSE1ZUlJKTiQlQCN0PWcu

My application to remove the content filter was also rejected. What justification works best for the application?


@liudhzhyym2

> You can try this one; it implements the typewriter effect

Could you share what justification you used when applying to remove the content filter?



@yanye99

yanye99 commented Oct 28, 2023

> https://github.com/hbsgithub/deno-azure-openai-proxy
> Building on @haibbo's excellent work, I added a mapper feature and rewrote it as a TypeScript script deployed on Deno Deploy. Features:
>   1. Unlike Cloudflare Workers, it can be used directly without a proxy
>   2. Supports custom subdomains (*.deno.dev) or binding your own domain
>   3. Supports typewriter-style streaming responses
>   4. Supports a mapper for custom model-mapping rules, or passing model names through unchanged
>   5. No server needed; free online deployment with a quota of 100,000 requests per month
> Everyone is welcome to use it!

It doesn't quite work, and I'm not sure what the problem is. With the Cloudflare deployment I previously didn't need a VPN, but in the last two days it became extremely slow and today it stopped working entirely. With yours, I only get half an answer and then it errors out with:
{
  "error": true,
  "message": "network error"
}
I'm thoroughly confused now; I don't know whether it's a hosting problem or a proxy problem.


@Yidadaa
Collaborator

Yidadaa commented Nov 7, 2023

Hey everyone, I need a valid Azure endpoint, Azure deployment version, and API key to test the Azure functionality. If anyone can provide this information, please send it to my email yidadaa@qq.com. Thank you very much!

@Yidadaa
Collaborator

Yidadaa commented Nov 9, 2023

#3206

@Yidadaa Yidadaa closed this as completed Nov 9, 2023
@iamalexblue

> https://github.com/hbsgithub/deno-azure-openai-proxy
> Building on @haibbo's excellent work, I added a mapper feature and rewrote it as a TypeScript script deployed on Deno Deploy. Features:
>   1. Unlike Cloudflare Workers, it can be used directly without a proxy
>   2. Supports custom subdomains (*.deno.dev) or binding your own domain
>   3. Supports typewriter-style streaming responses
>   4. Supports a mapper for custom model-mapping rules, or passing model names through unchanged
>   5. No server needed; free online deployment with a quota of 100,000 requests per month
> Everyone is welcome to use it!

This project is fantastic, thanks a lot!


@freefish1218

> You can try this one; it implements the typewriter effect
> https://github.com/haibbo/cf-openai-azure-proxy
> The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.

> Even with streaming enabled, Azure returns the text in blocks, because Azure adds a layer of sensitive-word filtering. We asked Microsoft to turn that filtering off (in the deployed model's options a Microsoft.Nil choice appears in the content filter; just select it), and after that the behavior is the same as ChatGPT.

Could you tell me which channel to contact Microsoft through to get this filter removed? Thanks.


@H0llyW00dzZ
Contributor

> You can try this one; it implements the typewriter effect
> https://github.com/haibbo/cf-openai-azure-proxy
> The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.

> Even with streaming enabled, Azure returns the text in blocks, because Azure adds a layer of sensitive-word filtering. We asked Microsoft to turn that filtering off (in the deployed model's options a Microsoft.Nil choice appears in the content filter; just select it), and after that the behavior is the same as ChatGPT.

> Could you tell me which channel to contact Microsoft through to get this filter removed? Thanks.

I don't think that's possible, because the content filter (known as text-moderation) is built into the models.

@floydchenv

This thing is damn hard to use. Can't you provide a complete guide?


@Alwaysion

It really is hard to use. Is there a complete doc?


@MTDickens

> You can try this one; it implements the typewriter effect
> https://github.com/haibbo/cf-openai-azure-proxy
> The idea is that although the messages coming from Azure OpenAI Service arrive in chunks, before handing them to the client I split them into individual messages and send them on one by one.

> Even with streaming enabled, Azure returns the text in blocks, because Azure adds a layer of sensitive-word filtering. We asked Microsoft to turn that filtering off (in the deployed model's options a Microsoft.Nil choice appears in the content filter; just select it), and after that the behavior is the same as ChatGPT.

> Could you tell me which channel to contact Microsoft through to get this filter removed? Thanks.

https://learn.microsoft.com/en-us/answers/questions/1190822/how-can-we-disable-content-filtering


@kehuantiantang

kehuantiantang commented May 27, 2024

> Search for an "azure openai proxy" project; that should meet your needs.

> Thanks for the idea. I used https://github.com/diemus/azure-openai-proxy/blob/main/README.zh-cn.md, replaced this project's base URL with the Docker server URL, and changed https to http. It works!

How do I configure this in the ChatGPT-Next-Web client? My server is fine, and I can reach the diemus proxy with curl, but ChatGPT-Next-Web shows "load failed".

[screenshot: load failed error]

@huangjiahui057

> Not the Playground. When you request Azure OpenAI directly with streaming enabled, Azure doesn't return token by token; it returns one large chunk, then another. So the typewriter effect is already gone at the Azure layer.

Has this problem been solved?


@xwi88

xwi88 commented Jul 11, 2024 via email


@coderabbitai coderabbitai bot mentioned this issue Jul 26, 2024