
Conversation

@xuyxu xuyxu commented Oct 25, 2023

The following code snippet will throw TypeError for now:

import torch
from diffusers import ControlNetModel
controlnet = ControlNetModel.from_single_file(
    "./control_sd15_canny.pth", torch_dtype=torch.float16, local_files_only=True
)

which gives us:

Traceback (most recent call last):
  File "D:\AI\diffusers_test.py", line 5, in <module>
    controlnet = ControlNetModel.from_single_file(
  File "D:\Programming\miniconda3\lib\site-packages\diffusers\loaders.py", line 2665, in from_single_file
    controlnet.to(torch_dtype=torch_dtype)
  File "D:\Programming\miniconda3\lib\site-packages\torch\nn\modules\module.py", line 1141, in to
    device, dtype, non_blocking, convert_to_format = torch._C._nn._parse_to(*args, **kwargs)
TypeError: to() received an invalid combination of arguments - got (torch_dtype=torch.dtype, ), but expected one of:
 * (torch.device device, torch.dtype dtype, bool non_blocking, bool copy, *, torch.memory_format memory_format)
 * (torch.dtype dtype, bool non_blocking, bool copy, *, torch.memory_format memory_format)
 * (Tensor tensor, bool non_blocking, bool copy, *, torch.memory_format memory_format)

The reason is that the keyword name torch_dtype passed here is wrong; it should be dtype, according to the signature of the to method that ControlNetModel inherits from PyTorch's nn.Module (module.py).
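A minimal sketch of the distinction, using a plain nn.Linear as a stand-in for the ControlNet model: nn.Module.to accepts the target dtype positionally or as dtype=..., but has no torch_dtype keyword.

```python
import torch

# Stand-in module; any nn.Module behaves the same way.
linear = torch.nn.Linear(4, 4)

# This reproduces the bug: `torch_dtype` is not a valid argument to Module.to().
try:
    linear.to(torch_dtype=torch.float16)
except TypeError:
    pass  # TypeError: to() received an invalid combination of arguments

# The fix: pass the dtype positionally (or as dtype=torch.float16).
linear.to(torch.float16)
assert linear.weight.dtype == torch.float16
```

So the one-line fix in loaders.py is to call controlnet.to(torch_dtype) instead of controlnet.to(torch_dtype=torch_dtype).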

@patrickvonplaten

Cool!

@patrickvonplaten patrickvonplaten merged commit dbce14d into huggingface:main Oct 25, 2023
kashif pushed a commit to kashif/diffusers that referenced this pull request Nov 11, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024