chore: refine litellm support (#81)
iuiaoin committed Aug 9, 2023
1 parent 59e0924 commit 82241c7
Showing 5 changed files with 35 additions and 26 deletions.
3 changes: 2 additions & 1 deletion .gitignore
```diff
@@ -5,4 +5,5 @@ config.json
 assets/*.png
 assets/*.mp4
 .ruff_cache
-plugins/**/
+plugins/**/
+litellm_uuid.txt
```
12 changes: 2 additions & 10 deletions README.md
@@ -13,7 +13,7 @@
src="https://img.shields.io/badge/python-%20%3E%3D%203.8-brightgreen"
/>
</a>
<a href="https://github.com/BerriAI/litellm">
<a href="https://github.com/BerriAI/litellm">
<img
alt="litellm"
src="https://img.shields.io/badge/%20%F0%9F%9A%85%20liteLLM-OpenAI%7CAzure%7CAnthropic%7CPalm%7CCohere-blue?color=green"
@@ -31,7 +31,7 @@
## 🌟 Features

- [x] **Extremely Stable:** Implemented with Windows hooks, so there is no risk of WeChat account restriction
- [x] **Basic Conversation:** Smart replies for private and group chats, with multi-round session context memory; supports GPT-3, GPT-3.5, GPT-4, Claude-2, Claude Instant-1, Command Nightly, and Palm models
- [x] **Basic Conversation:** Smart replies for private and group chats, with multi-round session context memory; supports GPT-3, GPT-3.5, GPT-4, Claude-2, Claude Instant-1, Command Nightly, Palm, and the other models listed in [litellm](https://litellm.readthedocs.io/en/latest/supported/)
- [x] **Image Generation:** Supports image generation; only the DALL-E model is available for now
- [x] **Flexible Configuration:** Supports prompt settings, proxy, command settings, etc.
- [x] **Plugin System:** Supports personalized plugin extensions so you can easily integrate the features you want
@@ -98,14 +98,6 @@ Then fill in the configuration in `config.json`, the following is the description
}
```

### Using litellm supported models
If you're using Claude-2, command-nightly, claude-instant-1,
or any of the supported [litellm models](https://litellm.readthedocs.io/en/latest/supported/), make sure you set the corresponding environment variables:
```
os.environ['COHERE_API_KEY']
os.environ['ANTHROPIC_API_KEY']
```
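The section removed above configured provider keys through environment variables, which litellm reads at call time. A minimal runnable sketch of that older pattern, with placeholder key values (the commented `completion` call is illustrative only):

```python
import os

# Placeholder keys; litellm reads these environment variables at call time.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-placeholder"
os.environ["COHERE_API_KEY"] = "cohere-placeholder"

# With the variables set, a call like
#   litellm.completion(model="claude-2", messages=[{"role": "user", "content": "hi"}])
# would resolve the Anthropic key automatically.
print(os.environ["ANTHROPIC_API_KEY"])  # sk-ant-placeholder
```

This commit removes the section because the refactored `bot/litellm.py` now assigns the configured key in code instead of relying on `.env` variables.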

### Running

#### 1. Prepare
8 changes: 7 additions & 1 deletion README_ZH.md
@@ -13,6 +13,12 @@
src="https://img.shields.io/badge/python-%20%3E%3D%203.8-brightgreen"
/>
</a>
<a href="https://github.com/BerriAI/litellm">
<img
alt="litellm"
src="https://img.shields.io/badge/%20%F0%9F%9A%85%20liteLLM-OpenAI%7CAzure%7CAnthropic%7CPalm%7CCohere-blue?color=green"
/>
</a>
</p>

> A WeChat bot based on ChatGPT, risk-free and very stable! 🚀
@@ -25,7 +31,7 @@
## 🌟 Features

- [x] **Extremely Stable:** Implemented with Windows hooks, so there is no risk of WeChat account restriction
- [x] **Basic Conversation:** Smart replies for private and group chats, with multi-round session context memory; supports GPT-3, GPT-3.5, and GPT-4 models
- [x] **Basic Conversation:** Smart replies for private and group chats, with multi-round session context memory; supports GPT-3, GPT-3.5, GPT-4, Claude-2, Claude Instant-1, Command Nightly, Palm, and the other models listed in [litellm](https://litellm.readthedocs.io/en/latest/supported/)
- [x] **Image Generation:** Supports image generation; only the DALL-E model is available for now
- [x] **Flexible Configuration:** Supports prompt settings, proxy, command settings, etc.
- [x] **Plugin System:** Supports personalized plugin extensions so you can easily integrate the features you want
18 changes: 11 additions & 7 deletions bot/bot.py
```diff
@@ -5,22 +5,26 @@
 from common.reply import Reply


 @singleton
 class Bot:
     def __init__(self):
         use_azure_chatgpt = conf().get("use_azure_chatgpt", False)
         model = conf().get("model", "gpt-3.5-turbo")
         if use_azure_chatgpt:
             from bot.azure_chatgpt import AzureChatGPTBot

             self.bot = AzureChatGPTBot()
-        elif model in litellm.model_list:
-            # see litellm supported models here:
-            # https://litellm.readthedocs.io/en/latest/supported/
-            from bot.litellm import liteLLMChatGPTBot
-            self.bot = liteLLMChatGPTBot()
-        else:
+        elif model in litellm.open_ai_chat_completion_models:
             from bot.chatgpt import ChatGPTBot

             self.bot = ChatGPTBot()
+        else:
+            # see litellm supported models here:
+            # https://litellm.readthedocs.io/en/latest/supported/
+            from bot.litellm import LiteLLMChatGPTBot
+
+            self.bot = LiteLLMChatGPTBot()

     def reply(self, context: Context) -> Reply:
         return self.bot.reply(context)
```
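The `bot/bot.py` change above reorders the dispatch: Azure first, then native OpenAI chat models, with LiteLLM as the fallback for everything else. A hypothetical sketch of that selection logic, where the model set is an illustrative stand-in for `litellm.open_ai_chat_completion_models`:

```python
# Illustrative stand-in for litellm.open_ai_chat_completion_models.
OPENAI_CHAT_MODELS = {"gpt-3.5-turbo", "gpt-4"}

def pick_bot(model: str, use_azure_chatgpt: bool = False) -> str:
    """Return the name of the bot class that would handle this model."""
    if use_azure_chatgpt:
        return "AzureChatGPTBot"
    if model in OPENAI_CHAT_MODELS:
        return "ChatGPTBot"
    # Anything else falls through to the LiteLLM-backed bot.
    return "LiteLLMChatGPTBot"

print(pick_bot("claude-2"))  # LiteLLMChatGPTBot
```

Making LiteLLM the `else` branch means newly supported litellm models work without further changes to the dispatcher.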
20 changes: 13 additions & 7 deletions bot/litellm.py
```diff
@@ -4,23 +4,29 @@
 from litellm import completion
 from utils.log import logger
 from config import conf
-import os

-class liteLLMChatGPTBot(ChatGPTBot):
-    def __init__(self):
-        openai.api_key = conf().get("openai_api_key")
-        os.environ['OPENAI_API_KEY'] = openai.api_key  # litellm reads env variables for keys

-        # extra litellm configs:
+class LiteLLMChatGPTBot(ChatGPTBot):
+    def __init__(self):
+        api_key = conf().get("openai_api_key")
+        model = conf().get("model", "gpt-3.5-turbo")
+        api_base = conf().get("openai_api_base")
+        proxy = conf().get("proxy")

+        if model in litellm.cohere_models:
+            litellm.cohere_key = api_key
+        elif model in litellm.anthropic_models:
+            litellm.anthropic_key = api_key
+        else:
+            litellm.openai_key = api_key

         if api_base:
             litellm.api_base = api_base
         if proxy:
             openai.proxy = proxy
         self.name = self.__class__.__name__
         self.args = {
-            "model": conf().get("model"),
+            "model": model,
             "temperature": conf().get("temperature"),
         }
```

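The rewritten `bot/litellm.py` routes the single configured key to the provider slot matching the chosen model, instead of exporting it as an environment variable. A hypothetical sketch of that routing, where the model sets are illustrative stand-ins for litellm's own model lists:

```python
# Illustrative stand-ins for litellm.cohere_models / litellm.anthropic_models.
COHERE_MODELS = {"command-nightly"}
ANTHROPIC_MODELS = {"claude-2", "claude-instant-1"}

def route_api_key(model: str, api_key: str) -> dict:
    """Return which provider key slot the configured key would fill."""
    if model in COHERE_MODELS:
        return {"cohere_key": api_key}
    if model in ANTHROPIC_MODELS:
        return {"anthropic_key": api_key}
    # Default: treat the key as an OpenAI key.
    return {"openai_key": api_key}

print(route_api_key("claude-2", "sk-test"))  # {'anthropic_key': 'sk-test'}
```

This keeps all key handling inside the bot's constructor, which is why the README's `.env` instructions could be deleted in the same commit.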
