📑 [DEMO] This is a demo PR of how to add an OpenAI Compatible Provider #2804
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #2804 +/- ##
==========================================
- Coverage 93.25% 93.24% -0.02%
==========================================
Files 380 382 +2
Lines 23795 23876 +81
Branches 2543 1868 -675
==========================================
+ Hits 22190 22262 +72
- Misses 1605 1614 +9
Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
Add the Provider Config entry for the new provider in this file. Once added, it shows up in the provider configuration list.
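To make the step concrete, here is a rough sketch only; the real file and schema are not reproduced in this comment, so every name below (`providerConfigList`, `newprovider`, the individual fields) is a hypothetical placeholder:

```ts
// Hypothetical sketch of registering a new provider's config entry.
// All identifiers and fields are placeholders, not LobeChat's actual schema.
const providerConfigList = [
  // ...existing provider configs elided
  {
    id: 'newprovider', // provider identifier used across the app (placeholder)
    showApiKey: true, // render an API key field in the settings form
    title: 'New Provider', // label shown in the provider configuration list
  },
];

export default providerConfigList;
```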
This file is the server-side implementation invoked by /chat/xxx. It needs to accept both the api key passed in by the client and the API_KEY env configured on the server.
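A minimal sketch of that key-resolution logic (the env variable name `NEWPROVIDER_API_KEY` and the helper are placeholders, not the actual route code):

```ts
// Hypothetical sketch: prefer the api key sent by the client,
// fall back to the API_KEY env configured on the server.
export const resolveNewProviderApiKey = (clientApiKey?: string): string => {
  const apiKey = clientApiKey || process.env.NEWPROVIDER_API_KEY;
  if (!apiKey) throw new Error('NewProvider API key is missing');
  return apiKey;
};
```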
Add the server-side API key environment variables here; there are usually two: ENABLED_XXX and XXXX_API_KEY.
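For example (sketch only; `NEWPROVIDER` is a placeholder provider name):

```ts
// Hypothetical sketch of exposing the two server-side env vars for the new provider.
export const getNewProviderEnv = () => ({
  ENABLED_NEWPROVIDER: process.env.ENABLED_NEWPROVIDER === '1',
  NEWPROVIDER_API_KEY: process.env.NEWPROVIDER_API_KEY,
});
```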
src/config/modelProviders stores the information about this provider: which models it offers, whether proxyURL is enabled, whether it supports pulling the model list, and so on.
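A rough sketch of what such an entry carries; the exact property names of the real provider card type are not spelled out in this comment, so the ones below are assumptions:

```ts
// Hypothetical provider card sketch; property names are assumptions.
const NewProviderCard = {
  chatModels: [
    { displayName: 'NewProvider Chat', id: 'newprovider-chat', tokens: 32_768 },
  ],
  id: 'newprovider',
  modelList: { showModelFetcher: true }, // whether pulling the model list is supported
  proxyUrl: { placeholder: 'https://api.newprovider.com/v1' }, // whether proxyURL is enabled
};

export default NewProviderCard;
```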
The metadata of this provider.
libs/agent-runtime is a module we plan to extract into an independent package in the future. The AgentRuntime in it is a universal runtime: you only need to pass in the provider name to call the corresponding provider's service. So this file also needs the new provider's input parameters added and the runtime instantiated.
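Conceptually, the change adds one more branch keyed by the provider name when the runtime is instantiated. The sketch below is self-contained and uses placeholder names (`LobeNewProviderAI`, `NewProviderParams`); it is not the actual AgentRuntime code:

```ts
// Hypothetical sketch of the provider dispatch inside the runtime initialization.
interface NewProviderParams {
  apiKey: string;
  baseURL?: string;
}

class LobeNewProviderAI {
  constructor(readonly params: NewProviderParams) {}
}

const initRuntimeWithProvider = (
  provider: string,
  params: { newprovider?: NewProviderParams },
) => {
  switch (provider) {
    // ...existing provider branches elided
    case 'newprovider':
      return new LobeNewProviderAI(params.newprovider!);
    default:
      throw new Error(`Unsupported provider: ${provider}`);
  }
};
```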
The runtime implementation of this model provider. For OpenAI-compatible providers we provide the LobeOpenAICompatibleFactory class to create the runtime quickly; other, more specialized providers such as Wenxin Yiyan (文心一言) or Google need a dedicated implementation.
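As a hedged sketch of that factory usage (the options `LobeOpenAICompatibleFactory` actually accepts and its import path are not shown in this comment, so `baseURL`, `provider`, and the path below are assumptions):

```ts
// Hypothetical usage sketch; option names and import path are assumptions.
import { LobeOpenAICompatibleFactory } from '../utils/openaiCompatibleFactory';

export const LobeNewProviderAI = LobeOpenAICompatibleFactory({
  baseURL: 'https://api.newprovider.com/v1', // placeholder OpenAI-compatible endpoint
  provider: 'newprovider', // placeholder provider identifier
});
```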
Add the model provider identifier here; this ModelProvider covers all the provider types we support.
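For instance (sketch only; the member name is a placeholder):

```ts
// Hypothetical sketch: extending the ModelProvider enum with the new identifier.
export enum ModelProvider {
  // ...existing members elided
  NewProvider = 'newprovider',
}
```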
This is the module that sends the server-side configuration to the front end. When the user has configured the provider's API key in the env, they naturally expect that provider to be enabled by default, so this status has to be sent from the server to the client.
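A minimal sketch of deriving that flag on the server (all names are placeholders):

```ts
// Hypothetical sketch: the provider is reported as enabled to the client
// whenever its API key is configured in the server env.
export const getServerProviderStatus = () => ({
  enabledNewProvider:
    process.env.ENABLED_NEWPROVIDER === '1' || !!process.env.NEWPROVIDER_API_KEY,
});
```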
keyVaults stores the provider api key information that the user fills in themselves. In the server-side DB implementation, we store keyVaults encrypted.
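A sketch of the shape (the field names are assumptions; only the encryption-at-rest behaviour is stated above):

```ts
// Hypothetical sketch of the keyVaults entry for the new provider.
// In the server-side DB implementation this object is stored encrypted.
interface NewProviderKeyVault {
  apiKey?: string;
  baseURL?: string; // optional endpoint/proxy override
}

interface UserKeyVaults {
  // ...entries for other providers elided
  newprovider?: NewProviderKeyVault;
}

export type { NewProviderKeyVault, UserKeyVaults };
```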
This is an example PR for adding an OpenAI interface compatible service provider. For every file I have explained the reason for the change; click Files Changed to view the change points.
Update date: 2024.06.08