
cuDNN Frontend error: No valid engine configs for Matmul_MUL_Reduction_SUB_EXP_Reduction_LOG_ADD_DIV_Matmul_ #9947

@zycore718

Description

1chara workflow.json

Custom Node Testing

Your question

I updated my system and GPU drivers this morning, and it appears to have broken ComfyUI. KSampler now fails with this error message:

cuDNN Frontend error: No valid engine configs for Matmul_MUL_Reduction_SUB_EXP_Reduction_LOG_ADD_DIV_Matmul_
{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":6}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":5}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":3}}

What happened? It's running on an NVIDIA RTX 4060 Ti 16GB, Pop!_OS (latest), driver version 580.82.07.
CUDA version: 11.5.119
PyTorch version: 2.9.0.dev20250901+cu129
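
For reference, here is a minimal sketch (the tensor shapes are illustrative guesses, not taken from the workflow) that prints the relevant versions and exercises the same call that fails in the stack trace, `torch.nn.functional.scaled_dot_product_attention` on fp16 CUDA tensors:

```python
# Minimal sketch: check versions and exercise the failing SDPA call outside ComfyUI.
# The tensor shapes below are illustrative guesses, not taken from the workflow.
import torch
import torch.nn.functional as F

print("torch:", torch.__version__)
print("torch CUDA:", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
print("device:", torch.cuda.get_device_name(0))

q = torch.randn(2, 8, 4096, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 4096, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 4096, 64, device="cuda", dtype=torch.float16)

# If the cuDNN SDPA backend is the problem, this should raise the same
# "No valid engine configs" RuntimeError; otherwise it prints the output shape.
out = F.scaled_dot_product_attention(q, k, v, attn_mask=None, dropout_p=0.0, is_causal=False)
print(out.shape)
```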

Logs

# ComfyUI Error Report
## Error Details
- **Node ID:** 10
- **Node Type:** KSampler (Efficient)
- **Exception Type:** RuntimeError
- **Exception Message:** cuDNN Frontend error: No valid engine configs for Matmul_MUL_Reduction_SUB_EXP_Reduction_LOG_ADD_DIV_Matmul_
{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":6}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":5}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":3}}



## Stack Trace

  File "/home/user1/ComfyUI/execution.py", line 496, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)

  File "/home/user1/ComfyUI/execution.py", line 315, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)

  File "/home/user1/ComfyUI/execution.py", line 289, in _async_map_node_over_list
    await process_inputs(input_dict, i)

  File "/home/user1/ComfyUI/execution.py", line 277, in process_inputs
    result = f(**inputs)

  File "/home/user1/ComfyUI/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 741, in sample
    samples, images, gifs, preview = process_latent_image(model, seed, steps, cfg, sampler_name, scheduler,

  File "/home/user1/ComfyUI/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 559, in process_latent_image
    samples = KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative,

  File "/home/user1/ComfyUI/nodes.py", line 1525, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)

  File "/home/user1/ComfyUI/nodes.py", line 1492, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,

  File "/home/user1/ComfyUI/comfy/sample.py", line 45, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 1161, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 1051, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 1036, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 1004, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 987, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 759, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/k_diffusion/sampling.py", line 219, in sample_euler_ancestral
    denoised = model(x, sigmas[i] * s_in, **extra_args)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 408, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 960, in __call__
    return self.outer_predict_noise(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 967, in outer_predict_noise
    ).execute(x, timestep, model_options, seed)

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 970, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 388, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 206, in calc_cond_batch
    return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 214, in _calc_cond_batch_outer
    return executor.execute(model, conds, x_in, timestep, model_options)

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/samplers.py", line 333, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)

  File "/home/user1/ComfyUI/comfy/model_base.py", line 160, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/model_base.py", line 199, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)

  File "/home/user1/ComfyUI/custom_nodes/SeargeSDXL/modules/custom_sdxl_ksampler.py", line 71, in new_unet_forward
    x0 = old_unet_forward(self, x, timesteps, context, y, control, transformer_options, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 831, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(

  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 873, in _forward
    h = forward_timestep_embed(module, h, emb, context, transformer_options, time_context=time_context, num_video_frames=num_video_frames, image_only_indicator=image_only_indicator)

  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 44, in forward_timestep_embed
    x = layer(x, context, transformer_options)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 922, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 808, in forward
    n = self.attn1(n, context=context_attn1, value=value_attn1, transformer_options=transformer_options)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 702, in forward
    out = optimized_attention(q, k, v, self.heads, attn_precision=self.attn_precision, transformer_options=transformer_options)

  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 130, in wrapper
    return func(*args, **kwargs)

  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 496, in attention_pytorch
    out = comfy.ops.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)

  File "/home/user1/ComfyUI/comfy/ops.py", line 47, in scaled_dot_product_attention
    return torch.nn.functional.scaled_dot_product_attention(q, k, v, *args, **kwargs)


## System Information
- **ComfyUI Version:** 0.3.59
- **Arguments:** main.py
- **OS:** posix
- **Python Version:** 3.10.12 (main, Aug 15 2025, 14:32:43) [GCC 11.4.0]
- **Embedded Python:** false
- **PyTorch Version:** 2.9.0.dev20250901+cu129
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 4060 Ti : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 16698769408
  - **VRAM Free:** 15706357760
  - **Torch VRAM Total:** 0
  - **Torch VRAM Free:** 0

## Logs

2025-09-19T15:17:39.449188 - [START] Security scan
2025-09-19T15:17:40.154417 - [DONE] Security scan
2025-09-19T15:17:40.170994 - ## ComfyUI-Manager: installing dependencies done.
2025-09-19T15:17:40.171021 - ** ComfyUI startup time: 2025-09-19 15:17:40.171
2025-09-19T15:17:40.171050 - ** Platform: Linux
2025-09-19T15:17:40.171076 - ** Python version: 3.10.12 (main, Aug 15 2025, 14:32:43) [GCC 11.4.0]
2025-09-19T15:17:40.171101 - ** Python executable: /usr/bin/python3
2025-09-19T15:17:40.171129 - ** ComfyUI Path: /home/user1/ComfyUI
2025-09-19T15:17:40.171154 - ** ComfyUI Base Folder Path: /home/user1/ComfyUI
2025-09-19T15:17:40.171180 - ** User directory: /home/user1/ComfyUI/user
2025-09-19T15:17:40.171205 - ** ComfyUI-Manager config path: /home/user1/ComfyUI/user/default/ComfyUI-Manager/config.ini
2025-09-19T15:17:40.171233 - ** Log path: /home/user1/ComfyUI/user/comfyui.log
2025-09-19T15:17:40.854245 - 
Prestartup times for custom nodes:
2025-09-19T15:17:40.854319 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/rgthree-comfy
2025-09-19T15:17:40.854346 -    1.5 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-manager
2025-09-19T15:17:40.854367 - 
2025-09-19T15:17:41.440648 - Checkpoint files will always be loaded safely.
2025-09-19T15:17:41.620541 - Total VRAM 15925 MB, total RAM 31700 MB
2025-09-19T15:17:41.620606 - pytorch version: 2.9.0.dev20250901+cu129
2025-09-19T15:17:41.620759 - Set vram state to: NORMAL_VRAM
2025-09-19T15:17:41.620899 - Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : cudaMallocAsync
2025-09-19T15:17:42.085830 - Using pytorch attention
2025-09-19T15:17:42.634244 - Python version: 3.10.12 (main, Aug 15 2025, 14:32:43) [GCC 11.4.0]
2025-09-19T15:17:42.634297 - ComfyUI version: 0.3.59
2025-09-19T15:17:42.636073 - ComfyUI frontend version: 1.26.13
2025-09-19T15:17:42.636483 - [Prompt Server] web root: /home/user1/.local/lib/python3.10/site-packages/comfyui_frontend_package/static
2025-09-19T15:17:42.710051 - /home/user1/.local/lib/python3.10/site-packages/kornia/feature/lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
  @torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
2025-09-19T15:17:43.129276 - Searge-SDXL v4.3.1 in /home/user1/ComfyUI/custom_nodes/SeargeSDXL
2025-09-19T15:17:44.520713 - WAS Node Suite: BlenderNeko's Advanced CLIP Text Encode found, attempting to enable `CLIPTextEncode` support.
2025-09-19T15:17:44.520763 - WAS Node Suite: `CLIPTextEncode (BlenderNeko Advanced + NSP)` node enabled under `WAS Suite/Conditioning` menu.
2025-09-19T15:17:44.879832 - WAS Node Suite: OpenCV Python FFMPEG support is enabled
2025-09-19T15:17:44.879905 - WAS Node Suite Warning: `ffmpeg_bin_path` is not set in `/home/user1/ComfyUI/custom_nodes/was-ns/was_suite_config.json` config file. Will attempt to use system ffmpeg binaries if available.
2025-09-19T15:17:45.244656 - WAS Node Suite: Finished. Loaded 221 nodes successfully.
	"Be the change that you wish to see in the world." - Mahatma Gandhi
2025-09-19T15:17:45.294015 - ⚡ MNeMiC Nodes: Loaded
2025-09-19T15:17:45.309288 - [rgthree-comfy] Loaded 48 epic nodes. 🎉
2025-09-19T15:17:45.311244 - ### Loading: ComfyUI-Manager (V3.37)
2025-09-19T15:17:45.311445 - [ComfyUI-Manager] network_mode: public
2025-09-19T15:17:45.333767 - ### ComfyUI Version: v0.3.59-38-gdc95b6ac | Released on '2025-09-19'
2025-09-19T15:17:45.341809 - ### Loading: ComfyUI-Impact-Pack (V8.22.2)
2025-09-19T15:17:45.394976 - [Impact Pack] Wildcards loading done.
2025-09-19T15:17:45.396488 - [/home/user1/ComfyUI/custom_nodes/comfyui_controlnet_aux] | INFO -> Using ckpts path: /home/user1/ComfyUI/custom_nodes/comfyui_controlnet_aux/ckpts
2025-09-19T15:17:45.396584 - [/home/user1/ComfyUI/custom_nodes/comfyui_controlnet_aux] | INFO -> Using symlinks: False
2025-09-19T15:17:45.396672 - [/home/user1/ComfyUI/custom_nodes/comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']
2025-09-19T15:17:45.684656 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-09-19T15:17:45.696994 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-09-19T15:17:45.721731 - set VIDEO_TOTAL_PIXELS: 90316800
2025-09-19T15:17:45.722790 - TransformersProviderNode: /ComfyLLMToolkit/get_transformer_models endpoint registered
2025-09-19T15:17:45.727107 - LLM Toolkit Node Root identified as: /home/user1/ComfyUI/custom_nodes/llm-toolkit
2025-09-19T15:17:45.727407 - No .env file found in the custom node root (/home/user1/ComfyUI/custom_nodes/llm-toolkit/.env). Will rely on system environment variables if available.
2025-09-19T15:17:45.727577 - No .env file found in the custom node root (/home/user1/ComfyUI/custom_nodes/llm-toolkit/.env). Relying on system environment variables.
2025-09-19T15:17:45.727732 - Successfully imported node_helpers directly from ComfyUI
2025-09-19T15:17:45.727843 - APIProviderSelectorNode: Endpoints registered
2025-09-19T15:17:45.727948 - LLMToolkit: Registered POST route: /ComfyLLMToolkit/get_provider_models
2025-09-19T15:17:45.733296 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-09-19T15:17:45.739857 - Successfully imported functions from main llmtoolkit_utils.py
2025-09-19T15:17:45.739915 - LLMToolkit: POST route /ComfyLLMToolkit/get_provider_models already registered. Skipping.
2025-09-19T15:17:45.739946 - ComfyUI-LLM-Toolkit API routes checked/registered!
2025-09-19T15:17:45.742697 - ### Loading: ComfyUI-Impact-Subpack (V1.3.5)
2025-09-19T15:17:45.743119 - [Impact Pack/Subpack] Using folder_paths to determine whitelist path: /home/user1/ComfyUI/user/default/ComfyUI-Impact-Subpack/model-whitelist.txt
2025-09-19T15:17:45.743190 - [Impact Pack/Subpack] Ensured whitelist directory exists: /home/user1/ComfyUI/user/default/ComfyUI-Impact-Subpack
2025-09-19T15:17:45.743232 - [Impact Pack/Subpack] Loaded 0 model(s) from whitelist: /home/user1/ComfyUI/user/default/ComfyUI-Impact-Subpack/model-whitelist.txt
2025-09-19T15:17:45.815184 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-09-19T15:17:45.834024 - [Impact Subpack] ultralytics_bbox: /home/user1/ComfyUI/models/ultralytics/bbox
2025-09-19T15:17:45.834819 - [Impact Subpack] ultralytics_segm: /home/user1/ComfyUI/models/ultralytics/segm
2025-09-19T15:17:45.844028 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-09-19T15:17:45.855978 - Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
2025-09-19T15:17:45.857248 - 
Import times for custom nodes:
2025-09-19T15:17:45.857289 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/websocket_image_save.py
2025-09-19T15:17:45.857309 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfy-image-saver
2025-09-19T15:17:45.857326 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/ComfyUI_ADV_CLIP_emb
2025-09-19T15:17:45.857342 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/cg-use-everywhere
2025-09-19T15:17:45.857357 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/ComfyUI_ResolutionSelector
2025-09-19T15:17:45.857372 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-advanced-controlnet
2025-09-19T15:17:45.857387 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/rgthree-comfy
2025-09-19T15:17:45.857401 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfyui_essentials
2025-09-19T15:17:45.857415 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/wlsh_nodes
2025-09-19T15:17:45.857429 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfyui_ultimatesdupscale
2025-09-19T15:17:45.857443 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/efficiency-nodes-comfyui
2025-09-19T15:17:45.857460 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-manager
2025-09-19T15:17:45.857475 -    0.0 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-mnemic-nodes
2025-09-19T15:17:45.857493 -    0.1 seconds: /home/user1/ComfyUI/custom_nodes/SeargeSDXL
2025-09-19T15:17:45.857508 -    0.1 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-impact-pack
2025-09-19T15:17:45.857522 -    0.1 seconds: /home/user1/ComfyUI/custom_nodes/comfyui-impact-subpack
2025-09-19T15:17:45.857535 -    0.1 seconds: /home/user1/ComfyUI/custom_nodes/comfyui_controlnet_aux
2025-09-19T15:17:45.857549 -    0.2 seconds: /home/user1/ComfyUI/custom_nodes/llm-toolkit
2025-09-19T15:17:45.857563 -    2.1 seconds: /home/user1/ComfyUI/custom_nodes/was-ns
2025-09-19T15:17:45.857576 - 
2025-09-19T15:17:46.160126 - Context impl SQLiteImpl.
2025-09-19T15:17:46.160184 - Will assume non-transactional DDL.
2025-09-19T15:17:46.160619 - No target revision found.
2025-09-19T15:17:46.167151 - Starting server

2025-09-19T15:17:46.167276 - To see the GUI go to: http://127.0.0.1:8188
2025-09-19T15:17:50.551550 - FETCH ComfyRegistry Data: 5/98
2025-09-19T15:17:55.340651 - Loaded 48 style names
2025-09-19T15:17:55.340851 - Loaded 44 banana task names
2025-09-19T15:17:55.341186 - Loaded 78 system prompt task names
2025-09-19T15:17:55.694763 - FETCH ComfyRegistry Data: 10/98
2025-09-19T15:18:01.151207 - FETCH ComfyRegistry Data: 15/98
2025-09-19T15:18:06.543974 - FETCH ComfyRegistry Data: 20/98
2025-09-19T15:18:11.333074 - FETCH ComfyRegistry Data: 25/98
2025-09-19T15:18:16.180753 - FETCH ComfyRegistry Data: 30/98
2025-09-19T15:18:21.115144 - FETCH ComfyRegistry Data: 35/98
2025-09-19T15:18:26.125856 - FETCH ComfyRegistry Data: 40/98
2025-09-19T15:18:31.006738 - FETCH ComfyRegistry Data: 45/98
2025-09-19T15:18:33.846008 - got prompt
2025-09-19T15:18:33.848376 - Failed to validate prompt for output 636:
2025-09-19T15:18:33.848416 - * (prompt):
2025-09-19T15:18:33.848437 -   - Required input is missing: images
2025-09-19T15:18:33.848457 - * Save Image w/Metadata 636:
2025-09-19T15:18:33.848478 -   - Required input is missing: images
2025-09-19T15:18:33.848497 - Output will be ignored
2025-09-19T15:18:33.848556 - Failed to validate prompt for output 25:
2025-09-19T15:18:33.848577 - * LatentFromBatch 16:
2025-09-19T15:18:33.848593 -   - Required input is missing: samples
2025-09-19T15:18:33.848610 - Output will be ignored
2025-09-19T15:18:33.848625 - Failed to validate prompt for output 18:
2025-09-19T15:18:33.848642 - Output will be ignored
2025-09-19T15:18:33.866734 - Using pytorch attention in VAE
2025-09-19T15:18:33.867461 - Using pytorch attention in VAE
2025-09-19T15:18:34.039658 - VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
2025-09-19T15:18:34.115686 - model weight dtype torch.float16, manual cast: None
2025-09-19T15:18:34.121844 - model_type EPS
2025-09-19T15:18:35.474124 - Using pytorch attention in VAE
2025-09-19T15:18:35.474681 - Using pytorch attention in VAE
2025-09-19T15:18:35.567102 - VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
2025-09-19T15:18:35.666657 - Requested to load SDXLClipModel
2025-09-19T15:18:35.673921 - loaded completely 9.5367431640625e+25 1560.802734375 True
2025-09-19T15:18:35.676842 - CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cuda:0, dtype: torch.float16
2025-09-19T15:18:35.906010 - Requested to load SDXLClipModel
2025-09-19T15:18:36.148554 - FETCH ComfyRegistry Data: 50/98
2025-09-19T15:18:36.790860 - Requested to load SDXLClipModel
2025-09-19T15:18:37.505035 - loaded completely 13478.925 1560.802734375 True
2025-09-19T15:18:37.754639 - Requested to load SDXL
2025-09-19T15:18:38.635290 - loaded completely 11709.005147369386 4897.0483474731445 True
2025-09-19T15:18:38.680303 - 
  0%|                                                    | 0/28 [00:00<?, ?it/s]
  0%|                                                    | 0/28 [00:00<?, ?it/s]
2025-09-19T15:18:38.966701 - !!! Exception during processing !!! cuDNN Frontend error: No valid engine configs for Matmul_MUL_Reduction_SUB_EXP_Reduction_LOG_ADD_DIV_Matmul_
{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":6}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":5}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":3}}


2025-09-19T15:18:38.969102 - Traceback (most recent call last):
  File "/home/user1/ComfyUI/execution.py", line 496, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
  File "/home/user1/ComfyUI/execution.py", line 315, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
  File "/home/user1/ComfyUI/execution.py", line 289, in _async_map_node_over_list
    await process_inputs(input_dict, i)
  File "/home/user1/ComfyUI/execution.py", line 277, in process_inputs
    result = f(**inputs)
  File "/home/user1/ComfyUI/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 741, in sample
    samples, images, gifs, preview = process_latent_image(model, seed, steps, cfg, sampler_name, scheduler,
  File "/home/user1/ComfyUI/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 559, in process_latent_image
    samples = KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative,
  File "/home/user1/ComfyUI/nodes.py", line 1525, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "/home/user1/ComfyUI/nodes.py", line 1492, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "/home/user1/ComfyUI/comfy/sample.py", line 45, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 1161, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 1051, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 1036, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 1004, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 987, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 759, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/k_diffusion/sampling.py", line 219, in sample_euler_ancestral
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 408, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 960, in __call__
    return self.outer_predict_noise(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 967, in outer_predict_noise
    ).execute(x, timestep, model_options, seed)
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 970, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 388, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 206, in calc_cond_batch
    return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 214, in _calc_cond_batch_outer
    return executor.execute(model, conds, x_in, timestep, model_options)
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/samplers.py", line 333, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "/home/user1/ComfyUI/comfy/model_base.py", line 160, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/model_base.py", line 199, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user1/ComfyUI/custom_nodes/SeargeSDXL/modules/custom_sdxl_ksampler.py", line 71, in new_unet_forward
    x0 = old_unet_forward(self, x, timesteps, context, y, control, transformer_options, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 831, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
  File "/home/user1/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 873, in _forward
    h = forward_timestep_embed(module, h, emb, context, transformer_options, time_context=time_context, num_video_frames=num_video_frames, image_only_indicator=image_only_indicator)
  File "/home/user1/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 44, in forward_timestep_embed
    x = layer(x, context, transformer_options)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 922, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 808, in forward
    n = self.attn1(n, context=context_attn1, value=value_attn1, transformer_options=transformer_options)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user1/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 702, in forward
    out = optimized_attention(q, k, v, self.heads, attn_precision=self.attn_precision, transformer_options=transformer_options)
  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 130, in wrapper
    return func(*args, **kwargs)
  File "/home/user1/ComfyUI/comfy/ldm/modules/attention.py", line 496, in attention_pytorch
    out = comfy.ops.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
  File "/home/user1/ComfyUI/comfy/ops.py", line 47, in scaled_dot_product_attention
    return torch.nn.functional.scaled_dot_product_attention(q, k, v, *args, **kwargs)
RuntimeError: cuDNN Frontend error: No valid engine configs for Matmul_MUL_Reduction_SUB_EXP_Reduction_LOG_ADD_DIV_Matmul_
{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":6}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":5}}

{"engineId":1,"smVersion":890,"knobChoices":{"CUDNN_KNOB_TYPE_KERNEL_CFG":3}}



2025-09-19T15:18:38.969966 - Prompt executed in 5.12 seconds


## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

Workflow too large. Please manually upload the workflow from local file system.


## Additional Context
(Please add any additional context or steps to reproduce the error here)
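
One possible workaround to try (a sketch only, not verified for this setup) is to keep PyTorch from selecting the cuDNN SDPA backend, so `scaled_dot_product_attention` falls back to the flash, memory-efficient, or math kernels:

```python
# Sketch of a possible workaround (untested here): disable the cuDNN SDPA backend
# so torch.nn.functional.scaled_dot_product_attention uses other implementations.
import torch

# Global switch:
torch.backends.cuda.enable_cudnn_sdp(False)

# Or restrict the allowed backends only around the failing call:
from torch.nn.attention import sdpa_kernel, SDPBackend

with sdpa_kernel([SDPBackend.FLASH_ATTENTION,
                  SDPBackend.EFFICIENT_ATTENTION,
                  SDPBackend.MATH]):
    pass  # run the attention call / sampling step here
```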
