This repository has been archived by the owner on Oct 12, 2023. It is now read-only.

Issues with Inpainting pipeline #59

Open
juancopi81 opened this issue Sep 25, 2023 · 0 comments

Comments

@juancopi81

Hi! Thanks for the great node. I was able to run it with the example workflow and it works very well. I only had to change:

self.control_model_wrapped = comfy.sd.ModelPatcher(self.control_model, load_device=comfy.model_management.get_torch_device(), offload_device=comfy.model_management.unet_offload_device())

to

self.control_model_wrapped = comfy.model_patcher.ModelPatcher(self.control_model, load_device=comfy.model_management.get_torch_device(), offload_device=comfy.model_management.unet_offload_device())
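
In case it is useful, a backwards-compatible version of that line could look like this (just a sketch based on the two variants above, assuming the only difference between ComfyUI versions is where ModelPatcher lives):

import comfy.model_management

try:
    from comfy.model_patcher import ModelPatcher   # newer ComfyUI
except ImportError:
    from comfy.sd import ModelPatcher              # older ComfyUI

self.control_model_wrapped = ModelPatcher(
    self.control_model,
    load_device=comfy.model_management.get_torch_device(),
    offload_device=comfy.model_management.unet_offload_device(),
)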

However, when I try to use it in one of my workflows that combines inpainting and ControlNet, I run into issues with both the inpainting model and ControlNet.

1. When I use the node with a non-inpainting model and a Canny ControlNet, I get this error:
Error occurred when executing KSampler:

can't multiply sequence by non-int of type 'float'

File "/home/ubuntu/ComfyUI/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "/home/ubuntu/ComfyUI/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/home/ubuntu/ComfyUI/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/home/ubuntu/ComfyUI/nodes.py", line 1265, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 176, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 310, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "/home/ubuntu/ComfyUI/comfy/samplers.py", line 742, in sample
samples = getattr(k_diffusion_sampling, "sample_{}".format(self.sampler))(self.model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar)
File "/home/ubuntu/ComfyUI/env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/ubuntu/ComfyUI/comfy/k_diffusion/sampling.py", line 707, in sample_dpmpp_sde_gpu
return sample_dpmpp_sde(model, x, sigmas, extra_args=extra_args, callback=callback, disable=disable, eta=eta, s_noise=s_noise, noise_sampler=noise_sampler, r=r)
File "/home/ubuntu/ComfyUI/env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/ubuntu/ComfyUI/comfy/k_diffusion/sampling.py", line 539, in sample_dpmpp_sde
denoised = model(x, sigmas[i] * s_in, **extra_args)
File "/home/ubuntu/ComfyUI/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/ubuntu/ComfyUI/comfy/samplers.py", line 323, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, cond_concat=cond_concat, model_options=model_options, seed=seed)
File "/home/ubuntu/ComfyUI/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/ubuntu/ComfyUI/comfy/k_diffusion/external.py", line 125, in forward
eps = self.get_eps(input * c_in, self.sigma_to_t(sigma), **kwargs)
File "/home/ubuntu/ComfyUI/comfy/k_diffusion/external.py", line 151, in get_eps
return self.inner_model.apply_model(*args, **kwargs)
File "/home/ubuntu/ComfyUI/comfy/samplers.py", line 311, in apply_model
out = sampling_function(self.inner_model.apply_model, x, timestep, uncond, cond, cond_scale, cond_concat, model_options=model_options, seed=seed)
File "/home/ubuntu/ComfyUI/comfy/samplers.py", line 289, in sampling_function
cond, uncond = calc_cond_uncond_batch(model_function, cond, uncond, x, timestep, max_total_area, cond_concat, model_options)
File "/home/ubuntu/ComfyUI/comfy/samplers.py", line 242, in calc_cond_uncond_batch
c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 435, in get_control
x *= self.strength
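
From the last frame it looks like x is a Python list (or tuple) of tensors at that point, since multiplying a sequence by a float raises exactly this TypeError. A tiny self-contained example of what I mean (illustrative only, not the node's actual code):

import torch

strength = 0.8
controls = [torch.zeros(1, 320, 64, 64), torch.zeros(1, 640, 32, 32)]  # made-up shapes

# controls *= strength  # TypeError: can't multiply sequence by non-int of type 'float'
controls = [c * strength for c in controls]  # scaling each tensor individually works
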
2. When I change the workflow to use an inpainting model with the same Canny ControlNet, I get:
Error occurred when executing KSampler:

Error in function: AITemplateModelContainerSetManyConstants

File "/home/ubuntu/ComfyUI/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "/home/ubuntu/ComfyUI/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/home/ubuntu/ComfyUI/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/home/ubuntu/ComfyUI/nodes.py", line 1265, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 176, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 274, in sample
AITemplate.unet[module['sha256']] = AITemplate.loader.apply_unet(
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/ait/load.py", line 222, in apply_unet
return self.apply(aitemplate_module, ait_params)
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/ait/load.py", line 201, in apply
aitemplate_module.set_many_constants_with_tensors(ait_params)
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/ait/module/model.py", line 840, in set_many_constants_with_tensors
self.set_many_constants(ait_tensors)
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/ait/module/model.py", line 791, in set_many_constants
self.DLL.AITemplateModelContainerSetManyConstants(
File "/home/ubuntu/ComfyUI/custom_nodes/AIT/AITemplate/ait/module/model.py", line 212, in _wrapped_func
raise RuntimeError(f"Error in function: {method.__name__}")
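
My guess (only an assumption on my side, I have not checked the AIT code) is that the compiled AITemplate module was built for a standard 4-channel UNet, while inpainting checkpoints use a 9-channel conv_in (4 latent + 4 masked-latent + 1 mask channels), so setting the constants fails. A quick way to check which kind of checkpoint is being loaded:

import safetensors.torch

sd = safetensors.torch.load_file("/path/to/checkpoint.safetensors")  # hypothetical path
conv_in = sd["model.diffusion_model.input_blocks.0.0.weight"]
print("UNet input channels:", conv_in.shape[1])  # 4 = standard SD, 9 = inpainting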

It works fine if I run it without ControlNet and with a non-inpainting model.

Thanks!
