
Error occurred when executing LoadChatGLM3 #16

Open
HEITAOKAKA opened this issue Jul 8, 2024 · 1 comment

Comments

@HEITAOKAKA

Error occurred when executing LoadChatGLM3:

Only Tensors of floating point and complex dtype can require gradients

File "I:\AI\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "I:\AI\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "I:\AI\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\nodes.py", line 124, in loadmodel
text_encoder.quantize(8)
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\modeling_chatglm.py", line 852, in quantize
quantize(self.encoder, weight_bit_width)
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\quantization.py", line 155, in quantize
layer.self_attention.query_key_value = QuantizedLinear(
^^^^^^^^^^^^^^^^
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\quantization.py", line 141, in init
self.weight = Parameter(self.weight.to(device), requires_grad=False)
^^^^^^^^^^^
File "I:\AI\ComfyUI.ext\Lib\site-packages\torch\nn\modules\module.py", line 1726, in setattr
self.register_parameter(name, value)
File "I:\AI\ComfyUI.ext\Lib\site-packages\accelerate\big_modeling.py", line 123, in register_empty_parameter
module._parameters[name] = param_cls(module._parameters[name].to(device), **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "I:\AI\ComfyUI.ext\Lib\site-packages\torch\nn\parameter.py", line 40, in new
return torch.Tensor._make_subclass(cls, data, requires_grad)
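
Judging from the frames above, accelerate's empty-weights hook (register_empty_parameter) re-creates the parameter without the requires_grad=False flag that QuantizedLinear passes, so the int8 weight ends up requiring gradients, which PyTorch rejects. A minimal sketch of that failure mode, independent of ComfyUI (the tensor shape and dtype here are illustrative assumptions):

import torch
from torch.nn import Parameter

# int8 weight, like the one built by text_encoder.quantize(8)
w = torch.zeros(4, 4, dtype=torch.int8)

ok = Parameter(w, requires_grad=False)   # what QuantizedLinear asks for; this is allowed

try:
    bad = Parameter(w)                   # requires_grad defaults to True
except RuntimeError as e:
    print(e)  # Only Tensors of floating point and complex dtype can require gradients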

@touchwolf

[Screenshot: screenshot-20240717-152649]

I encountered the same issue. I resolved it by changing the "load chatglm3 model" node to the "(download) load chatglm3 model" node, without changing the model save location. After this modification, everything worked fine.
