Commit f53c601: v0.4.4

Tongjilibo committed Dec 28, 2023 (1 parent: 68fffc6)
Showing 7 changed files with 10 additions and 4 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -78,6 +78,7 @@ pip install git+https://github.com/Tongjilibo/bert4torch

| Date | bert4torch | torch4keras | Release notes |
|------| ---------------- | ----------------- |----------- |
|20231228| 0.4.4 | 0.1.7|Added a `pipelines` module and moved `chat` into it; added a `Text2Vec` module for embedding generation; added `snapshot_download` for downloading models from HuggingFace|
|20231224| 0.4.3 | 0.1.7|Added common chat models to `chat`; simplified the calling logic for large models|
|20231219| 0.4.2 | 0.1.7|The `checkpoint_path` parameter now accepts a folder path; added a `chat` module for quickly publishing demos/APIs; support loading `.safetensors`; clearer error message for the `meta` device|
|20231210| 0.4.1 | 0.1.6.post2|Added longlora; added a test module; compatible with torch4keras==0.1.6 (monitors the fit process and emails an alert on errors; resolves the torch 2.0 compile conflict; fixes a clip_grad_norm bug)|
@@ -86,6 +87,7 @@ pip install git+https://github.com/Tongjilibo/bert4torch
[更多版本](https://github.com/Tongjilibo/bert4torch/blob/master/docs/Update.md)

## 5. Update history:
- **20231228**: Added a `pipelines` module and moved `chat` into it; added a `Text2Vec` module for embedding generation; added `snapshot_download` for downloading models from HuggingFace
- **20231224**: Added common chat models to `chat`; simplified the calling logic for large models
- **20231219**: The `checkpoint_path` parameter now accepts a folder path; added a `chat` module for quickly publishing demos/APIs; support loading `.safetensors`; clearer error message for the `meta` device
- **20231210**: Added longlora; added a test module; compatible with torch4keras==0.1.6 (monitors the fit process and emails an alert on errors; resolves the torch 2.0 compile conflict; fixes a clip_grad_norm bug)
4 changes: 4 additions & 0 deletions bert4torch/pipelines/chatllm.py
@@ -227,6 +227,7 @@ class ChatDemo(InputModel, ChatWeb):
from typing import Any, Dict, List, Literal, Optional, Union
from bert4torch.snippets import log_info, log_warn, cuda_empty_cache, AnyClass
from bert4torch.snippets import is_fastapi_available, is_pydantic_available, is_sseclient_available
+from packaging import version

FastAPI, BaseModel, Field= object, object, AnyClass
if is_fastapi_available():
@@ -308,6 +309,9 @@ def __init__(self, model_path, name='default_model', route_api='/chat', route_mo
super().__init__(model_path, **kwargs)
assert is_fastapi_available(), "No module found, use `pip install fastapi`"
from sse_starlette.sse import ServerSentEvent, EventSourceResponse
+import sse_starlette
+if version.parse(sse_starlette.__version__) > version.parse('1.8'):
+    log_warn('Module `sse_starlette` above 1.8 not support stream output')
self.EventSourceResponse = EventSourceResponse
self.name = name
self.role_user = 'user'
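The guard added in this diff compares the installed `sse_starlette` version against 1.8 using `packaging.version` rather than string comparison. A minimal standalone sketch of the same pattern (the helper name `stream_output_supported` is illustrative, not part of bert4torch):

```python
from packaging import version

def stream_output_supported(installed: str, ceiling: str = "1.8") -> bool:
    # packaging.version parses release segments numerically, so
    # "1.10" correctly compares greater than "1.8" (naive string
    # comparison would get this wrong).
    return version.parse(installed) <= version.parse(ceiling)

print(stream_output_supported("1.6.5"))  # 1.6.5 <= 1.8
print(stream_output_supported("1.8.2"))  # 1.8.2 > 1.8
```

This is why the commit imports `packaging` instead of comparing `sse_starlette.__version__` as a plain string.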
2 changes: 1 addition & 1 deletion bert4torch/snippets/import_utils.py
@@ -69,4 +69,4 @@ def is_trl_available():


def is_sseclient_available():
-    return is_package_available("sseclient")
+    return importlib.util.find_spec("sseclient")
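The revised check relies on `importlib.util.find_spec`, which locates a package on `sys.path` without executing it and returns `None` when it is absent. A standalone sketch of this availability-probe pattern (the helper name is illustrative; note the committed one-liner returns the spec object itself, which is merely truthy, rather than a strict bool):

```python
import importlib.util

def is_package_importable(name: str) -> bool:
    # find_spec searches the import machinery without importing
    # the module, so this is a cheap availability probe.
    return importlib.util.find_spec(name) is not None

print(is_package_importable("json"))                # stdlib, always present
print(is_package_importable("no_such_package_xyz")) # missing
```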
Binary file modified docs/pics/wechat_group.jpg
2 changes: 1 addition & 1 deletion examples/basic/glm/basic_language_model_chatglm2.py
@@ -19,7 +19,7 @@
demo = ChatGlm2Cli(model_path, **generation_config)

if __name__ == '__main__':
-    choice = 'gen_1toN' # cli, gen_1toN
+    choice = 'cli' # cli, gen_1toN

if choice == 'cli':
# 命令行demo
2 changes: 1 addition & 1 deletion examples/basic/llama/basic_language_model_llama-2.py
@@ -4,7 +4,7 @@
"""


-choice = 'llama-2-7b'
+choice = 'llama-2-7b-chat'
if choice == 'llama-2-7b':
dir_path = 'E:/pretrain_ckpt/llama/llama-2-7b'
with_prompt = False
2 changes: 1 addition & 1 deletion setup.py
@@ -7,7 +7,7 @@

setup(
name='bert4torch',
-    version='v0.4.3',
+    version='v0.4.4',
description='an elegant bert4torch',
long_description=long_description,
long_description_content_type="text/markdown",
