
Does v0.2.10 support BlueLM-7B-Chat / BlueLM-7B-Chat-32k? After setting the paths per the default configuration, `python startup.py -a` fails to start #4037

Open
caixianyu opened this issue May 17, 2024 · 2 comments
Labels
bug Something isn't working


@caixianyu

(Langchain-Chatchat) tcarh@K5RCPVT45N2DX5K:~/Langchain-Chatchat-0.2.10$ python startup.py -a

==============================Langchain-Chatchat Configuration==============================
OS: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
Python version: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0]
Project version: v0.2.10
langchain version: 0.0.354. fastchat version: 0.2.35

Current text splitter: ChineseRecursiveTextSplitter
LLM models being launched: ['BlueLM-7B-Chat', 'zhipu-api', 'openai-api'] @ cuda
{'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_path': '/home/tcarh/langchain-ChatGLM/BlueLM-7B-Chat',
'model_path_exists': True,
'port': 20002}
{'api_key': '',
'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'online_api': True,
'port': 21001,
'provider': 'ChatGLMWorker',
'version': 'glm-4',
'worker_class': <class 'server.model_workers.zhipu.ChatGLMWorker'>}
{'api_base_url': 'https://api.openai.com/v1',
'api_key': '',
'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_name': 'gpt-4',
'online_api': True,
'openai_proxy': '',
'port': 20002}
Current embeddings model: bge-large-zh-v1.5 @ cuda
==============================Langchain-Chatchat Configuration==============================

2024-05-17 09:32:41,222 - startup.py[line:655] - INFO: Starting services:
2024-05-17 09:32:41,222 - startup.py[line:656] - INFO: To view the llm_api logs, go to /home/tcarh/Langchain-Chatchat-0.2.10/logs
/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: model startup will be rewritten in Langchain-Chatchat 0.3.x with more modes and faster startup; the related 0.2.x functionality will be deprecated
warn_deprecated(
2024-05-17 09:32:47 | INFO | model_worker | Register to controller
2024-05-17 09:32:47 | ERROR | stderr | INFO: Started server process [211407]
2024-05-17 09:32:47 | ERROR | stderr | INFO: Waiting for application startup.
2024-05-17 09:32:47 | ERROR | stderr | INFO: Application startup complete.
2024-05-17 09:32:47 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:20000 (Press CTRL+C to quit)
2024-05-17 09:32:47 | INFO | model_worker | Loading the model ['BlueLM-7B-Chat'] on worker 4a9aa8ec ...
2024-05-17 09:32:48 | ERROR | stderr | Process model_worker - BlueLM-7B-Chat:
2024-05-17 09:32:48 | ERROR | stderr | Traceback (most recent call last):
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
2024-05-17 09:32:48 | ERROR | stderr | self.run()
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/multiprocessing/process.py", line 108, in run
2024-05-17 09:32:48 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/Langchain-Chatchat-0.2.10/startup.py", line 389, in run_model_worker
2024-05-17 09:32:48 | ERROR | stderr | app = create_model_worker_app(log_level=log_level, **kwargs)
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/Langchain-Chatchat-0.2.10/startup.py", line 217, in create_model_worker_app
2024-05-17 09:32:48 | ERROR | stderr | worker = ModelWorker(
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/serve/model_worker.py", line 77, in init
2024-05-17 09:32:48 | ERROR | stderr | self.model, self.tokenizer = load_model(
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/model/model_adapter.py", line 348, in load_model
2024-05-17 09:32:48 | ERROR | stderr | model, tokenizer = adapter.load_model(model_path, kwargs)
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/model/model_adapter.py", line 826, in load_model
2024-05-17 09:32:48 | ERROR | stderr | model = AutoModel.from_pretrained(
2024-05-17 09:32:48 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 569, in from_pretrained
2024-05-17 09:32:48 | ERROR | stderr | raise ValueError(
2024-05-17 09:32:48 | ERROR | stderr | ValueError: Unrecognized configuration class <class 'transformers_modules.BlueLM-7B-Chat.configuration_bluelm.BlueLMConfig'> for this kind of AutoModel: AutoModel.
2024-05-17 09:32:48 | ERROR | stderr | Model type should be one of AlbertConfig, AlignConfig, AltCLIPConfig, ASTConfig, AutoformerConfig, BarkConfig, BartConfig, BeitConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BitConfig, BlenderbotConfig, BlenderbotSmallConfig, BlipConfig, Blip2Config, BloomConfig, BridgeTowerConfig, BrosConfig, CamembertConfig, CanineConfig, ChineseCLIPConfig, ClapConfig, CLIPConfig, CLIPVisionConfig, CLIPSegConfig, ClvpConfig, LlamaConfig, CodeGenConfig, ConditionalDetrConfig, ConvBertConfig, ConvNextConfig, ConvNextV2Config, CpmAntConfig, CTRLConfig, CvtConfig, Data2VecAudioConfig, Data2VecTextConfig, Data2VecVisionConfig, DebertaConfig, DebertaV2Config, DecisionTransformerConfig, DeformableDetrConfig, DeiTConfig, DetaConfig, DetrConfig, DinatConfig, Dinov2Config, DistilBertConfig, DonutSwinConfig, DPRConfig, DPTConfig, EfficientFormerConfig, EfficientNetConfig, ElectraConfig, EncodecConfig, ErnieConfig, ErnieMConfig, EsmConfig, FalconConfig, FlaubertConfig, FlavaConfig, FNetConfig, FocalNetConfig, FSMTConfig, FunnelConfig, GitConfig, GLPNConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GPTSanJapaneseConfig, GraphormerConfig, GroupViTConfig, HubertConfig, IBertConfig, IdeficsConfig, ImageGPTConfig, InformerConfig, JukeboxConfig, Kosmos2Config, LayoutLMConfig, LayoutLMv2Config, LayoutLMv3Config, LEDConfig, LevitConfig, LiltConfig, LlamaConfig, LongformerConfig, LongT5Config, LukeConfig, LxmertConfig, M2M100Config, MarianConfig, MarkupLMConfig, Mask2FormerConfig, MaskFormerConfig, MaskFormerSwinConfig, MBartConfig, MCTCTConfig, MegaConfig, MegatronBertConfig, MgpstrConfig, MistralConfig, MixtralConfig, MobileBertConfig, MobileNetV1Config, MobileNetV2Config, MobileViTConfig, MobileViTV2Config, MPNetConfig, MptConfig, MraConfig, MT5Config, MvpConfig, NatConfig, NezhaConfig, NllbMoeConfig, NystromformerConfig, OneFormerConfig, 
OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, Owlv2Config, OwlViTConfig, PatchTSMixerConfig, PatchTSTConfig, PegasusConfig, PegasusXConfig, PerceiverConfig, PersimmonConfig, PhiConfig, PLBartConfig, PoolFormerConfig, ProphetNetConfig, PvtConfig, QDQBertConfig, ReformerConfig, RegNetConfig, RemBertConfig, ResNetConfig, RetriBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, SamConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SegformerConfig, SEWConfig, SEWDConfig, Speech2TextConfig, SpeechT5Config, SplinterConfig, SqueezeBertConfig, SwiftFormerConfig, SwinConfig, Swin2SRConfig, Swinv2Config, SwitchTransformersConfig, T5Config, TableTransformerConfig, TapasConfig, TimeSeriesTransformerConfig, TimesformerConfig, TimmBackboneConfig, TrajectoryTransformerConfig, TransfoXLConfig, TvltConfig, TvpConfig, UMT5Config, UniSpeechConfig, UniSpeechSatConfig, UnivNetConfig, VanConfig, VideoMAEConfig, ViltConfig, VisionTextDualEncoderConfig, VisualBertConfig, ViTConfig, ViTHybridConfig, ViTMAEConfig, ViTMSNConfig, VitDetConfig, VitsConfig, VivitConfig, Wav2Vec2Config, Wav2Vec2ConformerConfig, WavLMConfig, WhisperConfig, XCLIPConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig, YolosConfig, YosoConfig.
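One detail worth checking here (a hypothesis, not confirmed by the logs alone): FastChat selects a model adapter by substring-matching the model path, and the configured path sits under a directory named `langchain-ChatGLM`. If the ChatGLM adapter matches first, it loads the model via `AutoModel` rather than `AutoModelForCausalLM`, which would produce exactly this `ValueError` for BlueLM's custom config class. A minimal sketch of that matching logic:

```python
# Simplified sketch (an assumption about fastchat/model/model_adapter.py, not a
# copy of it): the ChatGLM adapter matches any model path containing "chatglm",
# and that adapter calls AutoModel.from_pretrained instead of
# AutoModelForCausalLM.from_pretrained.
def picks_chatglm_adapter(model_path: str) -> bool:
    # Case-insensitive substring match on the full path.
    return "chatglm" in model_path.lower()

# The configured path contains "langchain-ChatGLM", so it matches:
print(picks_chatglm_adapter("/home/tcarh/langchain-ChatGLM/BlueLM-7B-Chat"))  # True
# The same weights in a neutrally named directory would not:
print(picks_chatglm_adapter("/home/tcarh/models/BlueLM-7B-Chat"))  # False
```

If this hypothesis holds, moving the BlueLM weights to a directory whose path does not contain "chatglm" (and updating `MODEL_PATH` accordingly) should let the default adapter handle the model.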

@caixianyu caixianyu added the bug Something isn't working label May 17, 2024

dosubot bot commented May 17, 2024

To address the ValueError you're encountering with the BlueLM-7B-Chat model in Langchain-Chatchat version 0.2.10, ensure that the model identifier for BlueLM-7B-Chat is correctly specified. The correct identifier should be "vivo-ai/BlueLM-7B-Chat" in your project's configuration, specifically in the model_config.py or wherever the model is being loaded.

Additionally, confirm that you're using the correct version of the transformers library, which should be 4.37.2 for compatibility with BlueLM-7B-Chat. You can update your transformers library with pip install transformers==4.37.2.

Ensure that your model_config.py (or the relevant configuration file) correctly references the BlueLM-7B-Chat model with the entry "BlueLM-7B-Chat": "vivo-ai/BlueLM-7B-Chat". This is crucial for the model to be recognized and loaded properly.

If these steps do not resolve the issue, please check that the model is correctly available on the model hosting platform and that your environment can access it. If modifications were made to the configuration or code related to model loading, reverting to a known good configuration may also help isolate the issue.
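Before relaunching `startup.py`, it can also help to confirm that the active conda environment actually resolved the pinned version (4.37.2 here comes from the suggestion above, not from official BlueLM documentation). A small generic check using only the standard library:

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the version of an installed distribution, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Example: installed_version("transformers") should report "4.37.2" after
# running `pip install transformers==4.37.2` inside this environment.
```

Running this inside the same environment that launches `startup.py` rules out the common failure mode of pinning a package in one environment and starting the server from another.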


@caixianyu

caixianyu commented May 17, 2024

I installed transformers==4.37.2 as you suggested, but I still get the same error.
Successfully installed transformers-4.37.2
WARNING: There was an error checking the latest version of pip.
(Langchain-Chatchat) tcarh@K5RCPVT45N2DX5K:~/Langchain-Chatchat-0.2.10$ python startup.py -a

==============================Langchain-Chatchat Configuration==============================
OS: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
Python version: 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:40:32) [GCC 12.3.0]
Project version: v0.2.10
langchain version: 0.0.354. fastchat version: 0.2.35

Current text splitter: ChineseRecursiveTextSplitter
LLM models being launched: ['BlueLM-7B-Chat', 'zhipu-api', 'openai-api'] @ cuda
{'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_path': '/home/tcarh/langchain-ChatGLM/BlueLM-7B-Chat',
'model_path_exists': True,
'port': 20002}
{'api_key': '',
'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'online_api': True,
'port': 21001,
'provider': 'ChatGLMWorker',
'version': 'glm-4',
'worker_class': <class 'server.model_workers.zhipu.ChatGLMWorker'>}
{'api_base_url': 'https://api.openai.com/v1',
'api_key': '',
'device': 'cuda',
'host': '0.0.0.0',
'infer_turbo': False,
'model_name': 'gpt-4',
'online_api': True,
'openai_proxy': '',
'port': 20002}
Current embeddings model: bge-large-zh-v1.5 @ cuda
==============================Langchain-Chatchat Configuration==============================

2024-05-17 14:00:30,369 - startup.py[line:655] - INFO: Starting services:
2024-05-17 14:00:30,369 - startup.py[line:656] - INFO: To view the llm_api logs, go to /home/tcarh/Langchain-Chatchat-0.2.10/logs
/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: model startup will be rewritten in Langchain-Chatchat 0.3.x with more modes and faster startup; the related 0.2.x functionality will be deprecated
warn_deprecated(
2024-05-17 14:00:36 | INFO | model_worker | Register to controller
2024-05-17 14:00:36 | ERROR | stderr | INFO: Started server process [211857]
2024-05-17 14:00:36 | ERROR | stderr | INFO: Waiting for application startup.
2024-05-17 14:00:36 | ERROR | stderr | INFO: Application startup complete.
2024-05-17 14:00:36 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:20000 (Press CTRL+C to quit)
2024-05-17 14:00:37 | INFO | model_worker | Loading the model ['BlueLM-7B-Chat'] on worker 90e044e6 ...
2024-05-17 14:00:37 | ERROR | stderr | Process model_worker - BlueLM-7B-Chat:
2024-05-17 14:00:37 | ERROR | stderr | Traceback (most recent call last):
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
2024-05-17 14:00:37 | ERROR | stderr | self.run()
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/multiprocessing/process.py", line 108, in run
2024-05-17 14:00:37 | ERROR | stderr | self._target(*self._args, **self._kwargs)
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/Langchain-Chatchat-0.2.10/startup.py", line 389, in run_model_worker
2024-05-17 14:00:37 | ERROR | stderr | app = create_model_worker_app(log_level=log_level, **kwargs)
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/Langchain-Chatchat-0.2.10/startup.py", line 217, in create_model_worker_app
2024-05-17 14:00:37 | ERROR | stderr | worker = ModelWorker(
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/serve/model_worker.py", line 77, in init
2024-05-17 14:00:37 | ERROR | stderr | self.model, self.tokenizer = load_model(
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/model/model_adapter.py", line 348, in load_model
2024-05-17 14:00:37 | ERROR | stderr | model, tokenizer = adapter.load_model(model_path, kwargs)
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/fastchat/model/model_adapter.py", line 826, in load_model
2024-05-17 14:00:37 | ERROR | stderr | model = AutoModel.from_pretrained(
2024-05-17 14:00:37 | ERROR | stderr | File "/home/tcarh/anaconda3/envs/Langchain-Chatchat/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 569, in from_pretrained
2024-05-17 14:00:37 | ERROR | stderr | raise ValueError(
2024-05-17 14:00:37 | ERROR | stderr | ValueError: Unrecognized configuration class <class 'transformers_modules.BlueLM-7B-Chat.configuration_bluelm.BlueLMConfig'> for this kind of AutoModel: AutoModel.

I have also checked the configuration file, and the model is downloaded locally. The configuration should be fine: switching to the local chatglm3 does not raise this error.
```python
EMBEDDING_MODEL = "bge-large-zh-v1.5"

# Device the embedding model runs on. "auto" detects automatically (with a
# warning); can also be set to one of "cuda", "mps", "cpu", "xpu".
EMBEDDING_DEVICE = "auto"

# Selected reranker model
RERANKER_MODEL = "bge-reranker-large"

# Whether to enable the reranker model
USE_RERANKER = False
RERANKER_MAX_LENGTH = 1024

# Configure these when custom keywords need to be added to EMBEDDING_MODEL
EMBEDDING_KEYWORD_FILE = "keywords.txt"
EMBEDDING_MODEL_OUTPUT_PATH = "output"

# Names of the LLMs to run; may include local and online models. All local
# models in the list are loaded when the project starts.
# The first model in the list is the default model for the API and the WebUI.
# Here we use two currently mainstream offline models, with chatglm3-6b as the
# default loaded model.
# If you are short on VRAM, you can use Qwen-1_8B-Chat, which needs only
# 3.8 GB of VRAM in FP16.
LLM_MODELS = ["BlueLM-7B-Chat", "zhipu-api", "openai-api"]
Agent_MODEL = None

# Device the LLM runs on. "auto" detects automatically (with a warning); can
# also be set to one of "cuda", "mps", "cpu", "xpu".
LLM_DEVICE = "cuda"

"llm_model": {
    "chatglm2-6b": "THUDM/chatglm2-6b",
    "chatglm2-6b-32k": "THUDM/chatglm2-6b-32k",
    "chatglm3-6b": "THUDM/chatglm3-6b",
    "chatglm3-6b-32k": "/home/tcarh/langchain-ChatGLM/chatglm3-6b-32k",

    "Orion-14B-Chat": "OrionStarAI/Orion-14B-Chat",
    "Orion-14B-Chat-Plugin": "OrionStarAI/Orion-14B-Chat-Plugin",
    "Orion-14B-LongChat": "OrionStarAI/Orion-14B-LongChat",

    "Llama-2-7b-chat-hf": "meta-llama/Llama-2-7b-chat-hf",
    "Llama-2-13b-chat-hf": "meta-llama/Llama-2-13b-chat-hf",
    "Llama-2-70b-chat-hf": "meta-llama/Llama-2-70b-chat-hf",

    "Qwen-1_8B-Chat": "Qwen/Qwen-1_8B-Chat",
    "Qwen1.5-7B-Chat": "/home/tcarh/langchain-ChatGLM/Qwen1.5-7B-Chat",
    "Qwen-14B-Chat": "Qwen/Qwen-14B-Chat",
    "Qwen-72B-Chat": "Qwen/Qwen-72B-Chat",

    "baichuan-7b-chat": "baichuan-inc/Baichuan-7B-Chat",
    "baichuan-13b-chat": "baichuan-inc/Baichuan-13B-Chat",
    "baichuan2-7b-chat": "baichuan-inc/Baichuan2-7B-Chat",
    "baichuan2-13b-chat": "baichuan-inc/Baichuan2-13B-Chat",

    "internlm-7b": "internlm/internlm-7b",
    "internlm-chat-7b": "internlm/internlm-chat-7b",
    "internlm2-chat-7b": "internlm/internlm2-chat-7b",
    "internlm2-chat-20b": "internlm/internlm2-chat-20b",

    "BlueLM-7B-Chat": "/home/tcarh/langchain-ChatGLM/BlueLM-7B-Chat",
    "BlueLM-7B-Chat-32k": "/home/tcarh/langchain-ChatGLM/BlueLM-7B-Chat-32k",
```

@dosu
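A second angle on the `ValueError` itself: models that ship custom code, like BlueLM, declare in their `config.json` which Auto classes can dispatch them via an `auto_map`. If that map registers `AutoModelForCausalLM` but not plain `AutoModel`, then any code path calling `AutoModel.from_pretrained` fails regardless of the installed transformers version. The excerpt below is illustrative (assumed), not copied from the actual BlueLM `config.json`:

```python
import json

# Hypothetical excerpt of a remote-code model's config.json; the real entries
# should be checked in the downloaded model directory. AutoModel.from_pretrained
# can only dispatch to the Auto classes listed in "auto_map", so a map that
# omits "AutoModel" reproduces the ValueError seen in the traceback.
config_json = json.dumps({
    "model_type": "bluelm",
    "auto_map": {
        "AutoConfig": "configuration_bluelm.BlueLMConfig",
        "AutoModelForCausalLM": "modeling_bluelm.BlueLMForCausalLM",
    },
})

def dispatchable_auto_classes(raw_config: str) -> list[str]:
    """Return the Auto classes that this config declares as entry points."""
    return sorted(json.loads(raw_config).get("auto_map", {}))

print(dispatchable_auto_classes(config_json))
# ['AutoConfig', 'AutoModelForCausalLM'] -- no plain 'AutoModel' entry
```

Inspecting the `auto_map` in the locally downloaded BlueLM directory would confirm whether the loader simply needs to go through `AutoModelForCausalLM` instead of `AutoModel`.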
