✨ feat: support Google / Zhipu / AWS Bedrock model providers #1173
Conversation
👍 @arvinxx Thank you for raising your pull request and contributing to our Community
I think this could perhaps be packaged as an openai-adapter npm package; other projects might use it too. Then an adapter could be written for each model, similar to axios adapters. Someone has already built something like this, llm-adapter: https://github.com/fardjad/node-llmatic/blob/master/src/llm-adapter.ts#L71
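A minimal sketch of what such an "openai-adapter" package interface could look like, assuming hypothetical names (this is not the API of the llm-adapter project linked above): every provider implements the same contract and yields OpenAI-shaped chunks.

```ts
// Hypothetical sketch, not the llm-adapter API linked above: each provider
// implements the same adapter contract and streams OpenAI-shaped chunks.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ChatCompletionChunk {
  choices: { delta: { content?: string }; index: number }[];
}

interface LLMAdapter {
  // Stream a chat completion in OpenAI-compatible chunk format.
  chat(params: { model: string; messages: ChatMessage[] }): AsyncIterable<ChatCompletionChunk>;
}

// A provider-specific adapter only has to translate its native API into this contract.
class ZhipuAdapter implements LLMAdapter {
  constructor(private readonly apiKey: string) {}

  async *chat(params: { model: string; messages: ChatMessage[] }) {
    // ...call the Zhipu API with this.apiKey and `params` here, then translate
    // each native event into an OpenAI-compatible chunk before yielding it.
    yield { choices: [{ delta: { content: 'hello' }, index: 0 }] };
  }
}
```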
Personally, I don't think it would add much value. For example, if you look at the ai sdk's introduction to integrating various providers (https://sdk.vercel.ai/docs/guides/providers/mistral), each integration basically takes very little effort, and quite a few providers can even be used directly through openai. On the contrary, wrapping it in a package would cost us flexibility.
I use one-api, which can convert most LLMs to the openai sdk protocol.
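For illustration, this is roughly what consuming such a gateway looks like: the official openai SDK is pointed at the gateway's base URL. The URL and model name below are placeholders, not one-api's actual defaults.

```ts
import OpenAI from 'openai';

// Point the official OpenAI SDK at an OpenAI-compatible gateway such as one-api.
// Base URL and model name are placeholders for illustration.
const client = new OpenAI({
  apiKey: process.env.ONE_API_KEY,
  baseURL: 'http://localhost:3000/v1', // the gateway endpoint
});

const stream = await client.chat.completions.create({
  model: 'glm-4', // any model the gateway routes to
  messages: [{ role: 'user', content: 'Hello' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```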
That way requests would still go through the server, and local LLM support might have to be re-implemented later. A purely frontend large-model adapter might be better: adapt all models to the OpenAI protocol on the frontend.
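A rough sketch of that idea, using assumed shapes rather than actual lobe-chat code: a browser-side function that adapts a provider-native response into the OpenAI chat-completion format, so the rest of the UI only ever deals with one protocol.

```ts
// Assumed shapes for illustration, not actual lobe-chat code.
interface OpenAIChatResponse {
  choices: { index: number; message: { role: 'assistant'; content: string } }[];
}

// Example: adapting a (simplified) Gemini-style response in the browser.
interface GeminiLikeResponse {
  candidates: { content: { parts: { text: string }[] } }[];
}

function adaptGeminiToOpenAI(res: GeminiLikeResponse): OpenAIChatResponse {
  return {
    choices: res.candidates.map((candidate, index) => ({
      index,
      message: {
        role: 'assistant',
        content: candidate.content.parts.map((p) => p.text).join(''),
      },
    })),
  };
}
```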
A purely frontend implementation could also support a local proxy.
I plan to do this in two steps: first, refactor the existing backend implementation; second, implement the corresponding logic on the frontend (the shared parts will be extracted into agent-runtime), so users can choose for themselves whether requests go through the frontend or the backend.
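As a rough illustration of that split (the names below are assumptions, not the actual agent-runtime API): a shared runtime interface that both a server API route and the browser can use, which is what lets the user choose where requests are sent from.

```ts
// Assumed sketch, not the actual agent-runtime API.
interface ChatPayload {
  model: string;
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
}

interface AgentRuntime {
  chat(payload: ChatPayload): Promise<Response>;
}

// A minimal OpenAI-compatible runtime; Google / Zhipu / Bedrock would each
// implement the same interface with their own request/response mapping.
const createOpenAICompatibleRuntime = (baseURL: string, apiKey: string): AgentRuntime => ({
  chat: (payload) =>
    fetch(`${baseURL}/chat/completions`, {
      body: JSON.stringify(payload),
      headers: { Authorization: `Bearer ${apiKey}`, 'Content-Type': 'application/json' },
      method: 'POST',
    }),
});

// Server-side, the key comes from env vars; browser-side, the user-supplied key
// is used directly, so the request never has to pass through the server.
```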
refs:
Since this PR is not planned to include the local LLM solution for now, the plan for this phase is to finish refactoring the server-side part first.
Force-pushed from 6e36353 to 927ace3
Force-pushed from 7a8777c to 509b064
Codecov Report — Attention: Patch coverage is …

Additional details and impacted files:

| | main | #1173 | +/- |
| --- | --- | --- | --- |
| Coverage | 91.29% | 86.78% | -4.51% |
| Files | 184 | 211 | +27 |
| Lines | 8858 | 10355 | +1497 |
| Branches | 1070 | 1133 | +63 |
| Hits | 8087 | 8987 | +900 |
| Misses | 771 | 1368 | +597 |

☔ View full report in Codecov by Sentry.
Force-pushed from 058efff to 7455fdc
❤️ Great PR @arvinxx ❤️ The growth of the project is inseparable from user feedback and contributions, thanks for your contribution! If you are interested in the lobehub developer community, please join our Discord and then DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we discuss lobe-chat development and share AI news from around the world.
## [Version 0.123.0](v0.122.9...v0.123.0) <sup>Released on **2024-02-05**</sup>

#### ✨ Features

- **misc**: Support Google / Zhipu / AWS Bedrock model providers.

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **misc**: Support Google / Zhipu / AWS Bedrock model providers, closes [#1173](#1173) ([d5929f6](d5929f6))

</details>
🎉 This PR is included in version 0.123.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
## [Version 1.5.0](v1.4.11...v1.5.0) <sup>Released on **2024-02-05**</sup>

#### ✨ Features

- **misc**: Support Google / Zhipu / AWS Bedrock model providers.

#### 🐛 Bug Fixes

- **misc**: Fix rename.

#### 💄 Styles

- **settings**: Improve LLM connection checker style.

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **misc**: Support Google / Zhipu / AWS Bedrock model providers, closes [lobehub#1173](https://github.com/bentwnghk/lobe-chat/issues/1173) ([d5929f6](d5929f6))

#### What's fixed

* **misc**: Fix rename ([f6ecdff](f6ecdff))

#### Styles

* **settings**: Improve LLM connection checker style, closes [lobehub#1222](https://github.com/bentwnghk/lobe-chat/issues/1222) ([8c349a1](8c349a1))

</details>
Need support for the Google Gemini 1 and PaLM 2 models from Vertex AI Studio (similar to Azure OpenAI Service), not the ones already added from Google AI Studio (similar to the OpenAI Platform).
💻 Change Type

🔀 Description of Change
Support for multiple model providers; planned for phase one:

Goals for this phase:

Changes:
An error prompt panel for `InvalidXXXAPIKey`-type errors, and support for configuration panels under different Providers. Details: [RFC] 018 - Multi-Model Provider Phase One: Architecture Design & AWS Bedrock / Zhipu / Gemini / Moonshot Support #737 (reply in thread); see the sketch under Additional Information below.

📝 Additional Information
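For context, a hedged sketch of how such per-provider invalid-key errors could drive which configuration panel is shown. The names below are hypothetical, not lobe-chat's actual error types; only the `InvalidXXXAPIKey` pattern comes from the description above.

```ts
// Illustrative only: hypothetical per-provider invalid-key error types.
type InvalidKeyError = 'InvalidGoogleAPIKey' | 'InvalidZhipuAPIKey' | 'InvalidBedrockCredentials';

// Map an invalid-key error to the provider whose configuration panel should open.
const providerForError: Record<InvalidKeyError, 'google' | 'zhipu' | 'bedrock'> = {
  InvalidBedrockCredentials: 'bedrock',
  InvalidGoogleAPIKey: 'google',
  InvalidZhipuAPIKey: 'zhipu',
};
```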
refs: