
Commit

Merge master
w_xiaolizu committed Oct 27, 2023
2 parents e5b4668 + f7a332e commit d354ae5
Showing 41 changed files with 1,961 additions and 210 deletions.
23 changes: 12 additions & 11 deletions Dockerfile
@@ -1,34 +1,35 @@
# This Dockerfile builds an environment without local models. If you need local models such as chatglm, or the LaTeX runtime dependencies, see docker-compose.yml
# How to build: edit `config.py` first, then `docker build -t gpt-academic . `
# How to run (on Linux): `docker run --rm -it --net=host gpt-academic `
# How to run (other operating systems, pick any fixed port, e.g. 50923): `docker run --rm -it -e WEB_PORT=50923 -p 50923:50923 gpt-academic `
# This Dockerfile builds a minimal runtime environment without local models
# If you need local models such as chatglm, or the LaTeX runtime dependencies, see docker-compose.yml
# - How to build: edit `config.py` first, then `docker build -t gpt-academic . `
# - How to run (on Linux): `docker run --rm -it --net=host gpt-academic `
# - How to run (other operating systems, pick any fixed port, e.g. 50923): `docker run --rm -it -e WEB_PORT=50923 -p 50923:50923 gpt-academic `
FROM python:3.11


# Optional step: switch the pip mirror
# Optional step: switch the pip mirror (the following three lines can be removed)
RUN echo '[global]' > /etc/pip.conf && \
echo 'index-url = https://mirrors.aliyun.com/pypi/simple/' >> /etc/pip.conf && \
echo 'trusted-host = mirrors.aliyun.com' >> /etc/pip.conf


# Enter the working directory
# Enter the working directory (required)
WORKDIR /gpt


# Install most dependencies, leveraging the Docker cache to speed up later builds
# Install most dependencies, leveraging the Docker cache to speed up later builds (the following three lines can be removed)
COPY requirements.txt ./
COPY ./docs/gradio-3.32.2-py3-none-any.whl ./docs/gradio-3.32.2-py3-none-any.whl
COPY ./docs/gradio-3.32.6-py3-none-any.whl ./docs/gradio-3.32.6-py3-none-any.whl
RUN pip3 install -r requirements.txt


# Copy in the project files and install the remaining dependencies
# Copy in the project files and install the remaining dependencies (required)
COPY . .
RUN pip3 install -r requirements.txt


# Optional step: warm up the modules
# Optional step: warm up the modules (can be removed)
RUN python3 -c 'from check_proxy import warm_up_modules; warm_up_modules()'


# Launch
# Launch (required)
CMD ["python3", "-u", "main.py"]
8 changes: 5 additions & 3 deletions README.md
@@ -1,6 +1,6 @@
> **Note**
>
> 2023.7.8: Gradio and Pydantic dependencies have been adjusted and `requirements.txt` has been updated. Please **update your code** promptly, and when installing dependencies, strictly use the versions **specified in** `requirements.txt`.
> 2023.10.8: Gradio and Pydantic dependencies have been adjusted and `requirements.txt` has been updated. Please **update your code** promptly, and when installing dependencies, strictly use the versions **specified in** `requirements.txt`.
>
> `pip install -r requirements.txt`
@@ -16,7 +16,7 @@ To translate this project to arbitrary language with GPT, read and run [`multi_l
>
> 1. Please note that only function plugins (buttons) marked with a **highlight** support reading files, and some plugins live in the **drop-down menu** of the plugin area. In addition, we welcome and handle PRs for any new plugins with **top priority**.
>
> 2. The role of every file in this project is documented in detail in the [self-analysis report `self_analysis.md`](https://github.com/binary-husky/gpt_academic/wiki/GPT‐Academic项目自译解报告). As the project evolves, you can also click the relevant function plugin at any time to have GPT regenerate the project's self-analysis report. FAQ: [`wiki`](https://github.com/binary-husky/gpt_academic/wiki) | [Installation](#installation) | [Configuration guide](https://github.com/binary-husky/gpt_academic/wiki/%E9%A1%B9%E7%9B%AE%E9%85%8D%E7%BD%AE%E8%AF%B4%E6%98%8E)
> 2. The role of every file in this project is documented in detail in the [self-analysis report `self_analysis.md`](https://github.com/binary-husky/gpt_academic/wiki/GPT‐Academic项目自译解报告). As the project evolves, you can also click the relevant function plugin at any time to have GPT regenerate the project's self-analysis report. FAQ: [`wiki`](https://github.com/binary-husky/gpt_academic/wiki) | [Standard installation](#installation) | [One-click installation script](https://github.com/binary-husky/gpt_academic/releases) | [Configuration guide](https://github.com/binary-husky/gpt_academic/wiki/%E9%A1%B9%E7%9B%AE%E9%85%8D%E7%BD%AE%E8%AF%B4%E6%98%8E)
>
> 3. This project is compatible with, and encourages trying, Chinese-developed large language models such as ChatGLM and Moss. Multiple api-keys can coexist; in the config file you can write something like `API_KEY="openai-key1,openai-key2,azure-key3,api2d-key4"` (see the sketch below). When you need to switch `API_KEY` temporarily, enter the temporary `API_KEY` in the input area and press Enter to submit; it takes effect immediately.
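For illustration, a minimal sketch of how such a comma-separated `API_KEY` value could be split into individual keys and rotated through; the round-robin selection is an assumption made for this example, not necessarily how the project picks keys.

```python
import itertools

# Example value from the note above; replace with your own keys.
API_KEY = "openai-key1,openai-key2,azure-key3,api2d-key4"

# Split the combined string into individual keys and rotate through them (assumed policy).
keys = [k.strip() for k in API_KEY.split(",") if k.strip()]
key_cycle = itertools.cycle(keys)
print(next(key_cycle))  # openai-key1
print(next(key_cycle))  # openai-key2
```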
@@ -310,6 +310,8 @@ Tip: clicking `载入对话历史存档` directly, without specifying a file, lets you view the history h

### II: Versions:
- version 3.60 (todo): Improve the Void Terminal, introduce the code interpreter and more plugins
- version 3.55: Refactor the front-end UI, introduce floating windows and a menu bar
- version 3.54: Add a dynamic code interpreter (Code Interpreter) (work in progress)
- version 3.53: Support dynamically switching UI themes, improve stability & resolve multi-user conflicts
- version 3.50: Call all of the project's function plugins via natural language (Void Terminal), support plugin categories, improve the UI, design new themes
- version 3.49: Support the Baidu Qianfan platform and ERNIE Bot (Wenxin Yiyan)
@@ -331,7 +333,7 @@
- version 2.0: Introduce modular function plugins
- version 1.0: Basic features

gpt_academic developer QQ group 2: 610599535
GPT Academic developer QQ group: `610599535`

- Known issues
  - Some browser translation extensions interfere with the front end of this software
17 changes: 12 additions & 5 deletions comm_tools/core_functional.py
@@ -92,8 +92,15 @@ def handle_core_functionality(additional_fn, inputs, history, chatbot):
    import core_functional
    importlib.reload(core_functional)    # hot-reload the prompts
    core_functional = core_functional.get_core_functions()
    if "PreProcess" in core_functional[additional_fn]: inputs = core_functional[additional_fn]["PreProcess"](inputs)  # fetch the pre-processing function (if there is one)
    inputs = core_functional[additional_fn]["Prefix"] + inputs + core_functional[additional_fn]["Suffix"]
    if core_functional[additional_fn].get("AutoClearHistory", False):
        history = []
    return inputs, history
    addition = chatbot._cookies['customize_fn_overwrite']
    if additional_fn in addition:
        # user-defined (custom) function
        inputs = addition[additional_fn]["Prefix"] + inputs + addition[additional_fn]["Suffix"]
        return inputs, history
    else:
        # preset function
        if "PreProcess" in core_functional[additional_fn]: inputs = core_functional[additional_fn]["PreProcess"](inputs)  # fetch the pre-processing function (if there is one)
        inputs = core_functional[additional_fn]["Prefix"] + inputs + core_functional[additional_fn]["Suffix"]
        if core_functional[additional_fn].get("AutoClearHistory", False):
            history = []
        return inputs, history
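To make the preset branch above concrete, here is a small self-contained sketch; the button name and prompt text are illustrative stand-ins for what `get_core_functions()` actually returns.

```python
# Illustrative preset entry, shaped like the fields handle_core_functionality reads.
core_functional = {
    "ExamplePolish": {                      # hypothetical button name
        "Prefix": "Polish the following academic paragraph:\n\n",
        "Suffix": "",
        "AutoClearHistory": False,
    }
}

additional_fn, inputs, history = "ExamplePolish", "The results is significant.", ["earlier turn"]

entry = core_functional[additional_fn]
if "PreProcess" in entry:                    # optional pre-processing hook
    inputs = entry["PreProcess"](inputs)
inputs = entry["Prefix"] + inputs + entry["Suffix"]
if entry.get("AutoClearHistory", False):     # some presets start from a clean history
    history = []
print(inputs)
```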
26 changes: 16 additions & 10 deletions comm_tools/toolbox.py
@@ -663,11 +663,9 @@ def promote_file_to_downloadzone(file, rename_file=None, chatbot=None):
    # copy the file over
    if not os.path.exists(new_path): shutil.copyfile(file, new_path)
    # record the file in the chatbot cookie, so that users do not interfere with each other
    if chatbot:
        if 'files_to_promote' in chatbot._cookies:
            current = chatbot._cookies['files_to_promote']
        else:
            current = []
    if chatbot is not None:
        if 'files_to_promote' in chatbot._cookies: current = chatbot._cookies['files_to_promote']
        else: current = []
        chatbot._cookies.update({'files_to_promote': [new_path] + current})
    return new_path
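A self-contained usage sketch of the bookkeeping above; the stand-in chatbot object, the simplified helper, and the download folder are assumptions for illustration only. In the project, `chatbot` is the gradio chatbot carrying the per-user `_cookies` dict.

```python
import os, shutil

class FakeChatbot:
    """Stand-in for the gradio chatbot object: only the _cookies dict matters here."""
    def __init__(self):
        self._cookies = {}

def promote_file_to_downloadzone_sketch(file, rename_file=None, chatbot=None, downloadzone='demo_downloads'):
    # Simplified restatement of the logic shown above, for illustration only.
    os.makedirs(downloadzone, exist_ok=True)
    new_path = os.path.join(downloadzone, rename_file or os.path.basename(file))
    if not os.path.exists(new_path): shutil.copyfile(file, new_path)
    if chatbot is not None:
        current = chatbot._cookies.get('files_to_promote', [])
        chatbot._cookies.update({'files_to_promote': [new_path] + current})
    return new_path

chatbot = FakeChatbot()
with open('demo_report.md', 'w') as f:
    f.write('# demo')
print(promote_file_to_downloadzone_sketch('demo_report.md', rename_file='report.md', chatbot=chatbot))
print(chatbot._cookies['files_to_promote'])
```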

@@ -794,12 +792,20 @@ def on_report_generated(cookies, files, chatbot, request):

def load_chat_cookies():
    API_KEY, LLM_MODEL, AZURE_API_KEY = get_conf('API_KEY', 'LLM_MODEL', 'AZURE_API_KEY')
    DARK_MODE, NUM_CUSTOM_BASIC_BTN = get_conf('DARK_MODE', 'NUM_CUSTOM_BASIC_BTN')
    if is_any_api_key(AZURE_API_KEY):
        if is_any_api_key(API_KEY):
            API_KEY = API_KEY + ',' + AZURE_API_KEY
        else:
            API_KEY = AZURE_API_KEY
    return {'api_key': API_KEY, 'llm_model': LLM_MODEL}
        if is_any_api_key(API_KEY): API_KEY = API_KEY + ',' + AZURE_API_KEY
        else: API_KEY = AZURE_API_KEY
    customize_fn_overwrite_ = {}
    for k in range(NUM_CUSTOM_BASIC_BTN):
        customize_fn_overwrite_.update({
            "自定义按钮" + str(k+1):{
                "Title": r"",
                "Prefix": r"请在自定义菜单中定义提示词前缀.",
                "Suffix": r"请在自定义菜单中定义提示词后缀",
            }
        })
    return {'api_key': API_KEY, 'llm_model': LLM_MODEL, 'customize_fn_overwrite': customize_fn_overwrite_}
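For reference, a sketch of the cookie dict this returns with the default `NUM_CUSTOM_BASIC_BTN = 4`; the api key and model are placeholder values, and the Chinese placeholder prompts translate roughly to "define the prompt prefix/suffix in the custom menu".

```python
# Approximate shape of load_chat_cookies()'s return value (illustrative values).
cookies = {
    'api_key': 'sk-openai-key,azure-key',   # merged OpenAI + Azure keys
    'llm_model': 'gpt-3.5-turbo',
    'customize_fn_overwrite': {
        '自定义按钮1': {'Title': '', 'Prefix': '请在自定义菜单中定义提示词前缀.', 'Suffix': '请在自定义菜单中定义提示词后缀'},
        '自定义按钮2': {'Title': '', 'Prefix': '请在自定义菜单中定义提示词前缀.', 'Suffix': '请在自定义菜单中定义提示词后缀'},
        # ... and so on up to 自定义按钮4
    },
}
```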


def is_openai_api_key(key):
33 changes: 27 additions & 6 deletions config.py
@@ -48,6 +48,7 @@
THEME = "Default"
AVAIL_THEMES = ["Default", "Chuanhu-Small-and-Beautiful", "High-Contrast", "Gstaff/Xkcd", "NoCrypt/Miku"]


# Height of the chat window (only takes effect when LAYOUT="TOP-DOWN")
CHATBOT_HEIGHT = 1115

@@ -58,7 +59,10 @@

# Window layout
LAYOUT = "LEFT-RIGHT" # "LEFT-RIGHT" (side-by-side) # "TOP-DOWN" (stacked)
DARK_MODE = True # dark mode / light mode


# Dark mode / light mode
DARK_MODE = True


# How long to wait, after sending a request to OpenAI, before treating it as a timeout
@@ -81,8 +85,9 @@
LLM_MODEL = "gpt-3.5-turbo" # options ↓↓↓
AVAIL_LLM_MODELS = ["gpt-3.5-turbo-16k", "gpt-3.5-turbo", "azure-gpt-3.5", "api2d-gpt-3.5-turbo",
"gpt-4", "gpt-4-32k", "azure-gpt-4", "api2d-gpt-4", "chatglm", "moss", "newbing", "stack-claude"]
# P.S. Other available models also include ["qianfan", "llama2", "qwen", "gpt-3.5-turbo-0613", "gpt-3.5-turbo-16k-0613",
# "spark", "sparkv2", "chatglm_onnx", "claude-1-100k", "claude-2", "internlm", "jittorllms_pangualpha", "jittorllms_llama"]

# P.S. Other available models also include ["qianfan", "llama2", "qwen", "gpt-3.5-turbo-0613", "gpt-3.5-turbo-16k-0613", "gpt-3.5-random"
# "spark", "sparkv2", "sparkv3", "chatglm_onnx", "claude-1-100k", "claude-2", "internlm", "jittorllms_pangualpha", "jittorllms_llama"]


# Baidu Qianfan (LLM_MODEL="qianfan")
@@ -121,6 +126,11 @@
CUSTOM_PATH = "/"


# HTTPS key and certificate (no need to modify)
SSL_KEYFILE = ""
SSL_CERTFILE = ""
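A hedged sketch of how these HTTPS settings might be wired up, assuming they are read via `get_conf` (as other options are in comm_tools/toolbox.py above) and handed to gradio's `launch()`; the import path and the launch wiring are assumptions, not necessarily what main.py does.

```python
import gradio as gr
from comm_tools.toolbox import get_conf   # assumed import path for this fork's layout

SSL_KEYFILE, SSL_CERTFILE = get_conf('SSL_KEYFILE', 'SSL_CERTFILE')

demo = gr.Blocks()
# Pass the key/cert through only when both are configured; otherwise serve plain HTTP.
launch_kwargs = {}
if SSL_KEYFILE and SSL_CERTFILE:
    launch_kwargs.update(ssl_keyfile=SSL_KEYFILE, ssl_certfile=SSL_CERTFILE)
demo.launch(**launch_kwargs)
```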


# In rare cases, the official OpenAI key must be used together with an organization ID (format like org-xxxxxxxxxxxxxxxxxxxxxxxx)
API_ORG = ""

Expand All @@ -139,7 +149,7 @@
} # see docs\use_azure.md; the key is your deployment name, and for the value compute each model's maximum token count yourself, e.g. for 3.5 it is 4096 = 1024 * 4
AVAIL_LLM_MODELS.extend([f"azure-{i}" for i in AZURE_ENGINE_DICT]) # automatically added to the model list
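A loosely hedged sketch of how a deployment entry and the auto-extend line interact; the deployment names, and reading the value as a bare max-token count, are guesses based only on the comment above, and docs\use_azure.md is the authoritative reference for the real shape of `AZURE_ENGINE_DICT`.

```python
# Hypothetical deployments; key = your Azure deployment name, value = that model's max token count
# (value format guessed from the comment above; see docs/use_azure.md for the real layout).
AZURE_ENGINE_DICT = {
    "my-gpt35-deployment": 4096,
    "my-gpt4-deployment": 8192,
}
AVAIL_LLM_MODELS = ["gpt-3.5-turbo"]
AVAIL_LLM_MODELS.extend([f"azure-{i}" for i in AZURE_ENGINE_DICT])
print(AVAIL_LLM_MODELS)  # ['gpt-3.5-turbo', 'azure-my-gpt35-deployment', 'azure-my-gpt4-deployment']
```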

# Use Newbing
# Use Newbing (not recommended; will be removed in the future)
NEWBING_STYLE = "creative" # ["creative", "balanced", "precise"]
NEWBING_COOKIES = """
put your new bing cookies here
@@ -176,19 +186,30 @@
# How to obtain: duplicate the space https://huggingface.co/spaces/qingxu98/grobid, set it to public, then GROBID_URL = "https://(your-hf-username, e.g. qingxu98)-(your-space-name, e.g. grobid).hf.space"
GROBID_URLS = [
"https://qingxu98-grobid.hf.space","https://qingxu98-grobid2.hf.space","https://qingxu98-grobid3.hf.space",
"https://shaocongma-grobid.hf.space","https://FBR123-grobid.hf.space", "https://yeku-grobid.hf.space",
"https://qingxu98-grobid4.hf.space","https://qingxu98-grobid5.hf.space", "https://qingxu98-grobid6.hf.space",
"https://qingxu98-grobid7.hf.space", "https://qingxu98-grobid8.hf.space",
]
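A small self-contained sketch of one way to pick a usable mirror from this list; the helper below is illustrative and is not the project's own selection logic.

```python
import random
import requests

def pick_grobid_url(urls, timeout=5):
    """Illustrative helper: return the first configured GROBID mirror that responds."""
    candidates = list(urls)
    random.shuffle(candidates)          # spread load across the mirrors
    for url in candidates:
        try:
            if requests.get(url, timeout=timeout).status_code == 200:
                return url.rstrip('/')
        except requests.RequestException:
            continue
    return None

# Example: pick_grobid_url(GROBID_URLS)
```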


# Whether to allow modifying this page's configuration through natural-language descriptions; this feature is somewhat risky and is disabled by default
ALLOW_RESET_CONFIG = False


# Location of the temporary upload folder; do not modify
PATH_PRIVATE_UPLOAD = "private_upload"


# Location of the log folder; do not modify
PATH_LOGGING = "gpt_log"


# Scenarios, besides connecting to OpenAI, in which the proxy may be used; do not modify
WHEN_TO_USE_PROXY = ["Download_LLM", "Download_Gradio_Theme", "Connect_Grobid", "Warmup_Modules"]
WHEN_TO_USE_PROXY = ["Download_LLM", "Download_Gradio_Theme", "Connect_Grobid",
"Warmup_Modules", "Nougat_Download", "AutoGen"]


# Upper limit on the number of custom buttons
NUM_CUSTOM_BASIC_BTN = 4

"""
Schematic of how the online LLM configuration options relate to each other
