Is there an existing issue for this?
Current Behavior
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.

Traceback (most recent call last):
  File "D:\workplace\CHATGLM\ChatGLM-6B\tt.py", line 2, in <module>
    tokenizer = AutoTokenizer.from_pretrained("../bb-model", trust_remote_code=True)
  File "D:\python\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 663, in from_pretrained
    tokenizer_class = get_class_from_dynamic_module(
  File "D:\python\lib\site-packages\transformers\dynamic_module_utils.py", line 399, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module.replace(".py", ""))
  File "D:\python\lib\site-packages\transformers\dynamic_module_utils.py", line 177, in get_class_in_module
    module = importlib.import_module(module_path)
  File "D:\python\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.'
Expected Behavior
No response
Steps To Reproduce
Ran the following file:
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("../bb-model", trust_remote_code=True)
model = AutoModel.from_pretrained("../bb-model", trust_remote_code=True).float()
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
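One plausible way the module name in the error can end up empty (a hypothesis, not confirmed in this report) is a model path whose basename is empty, e.g. one ending in a path separator, since the dynamically loaded module is named `transformers_modules.<directory name>`. A minimal sketch of that failure mode, where `dynamic_module_name` is a hypothetical stand-in for the transformers internals, not the library's actual code:

```python
import os

def dynamic_module_name(model_path):
    # Illustrative only -- assumes the dynamic module name is derived
    # from the basename of the local model path.
    return "transformers_modules." + os.path.basename(model_path)

# A trailing separator leaves the basename empty, reproducing the
# malformed name seen in the traceback.
print(dynamic_module_name("../bb-model/"))                    # transformers_modules.

# Normalizing the path first yields a well-formed module name.
print(dynamic_module_name(os.path.normpath("../bb-model/")))  # transformers_modules.bb-model
```

If that is the cause here, resolving the path with `os.path.abspath` or `os.path.normpath` before calling `from_pretrained` should avoid the empty module name.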
Environment
Anything else?
No response