
Can't start processing scripts (export_onnx.py and export_trt.py) #15

Open
Maelstrom2014 opened this issue Jun 17, 2024 · 9 comments

@Maelstrom2014

Hi! How do I get this working on Windows?

c:\ai\comfyui>.\python_embeded\python.exe c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_onnx.py
Total VRAM 16379 MB, total RAM 49134 MB
pytorch version: 2.3.0+cu121
xformers version: 0.0.26.post1
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : native
VAE dtype: torch.bfloat16
WARNING: comfy_extras.chainner_models is deprecated and has been replaced by the spandrel library.
Traceback (most recent call last):
  File "c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_onnx.py", line 113, in <module>
    torch.onnx.export(upscale_model,
  File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 516, in export
    _export(
  File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 1589, in _export
    with exporter_context(model, training, verbose):
  File "contextlib.py", line 137, in __enter__
  File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 179, in exporter_context
    with select_model_mode_for_export(
  File "contextlib.py", line 137, in __enter__
  File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 140, in disable_apex_o2_state_dict_hook
    for module in model.modules():
                  ^^^^^^^^^^^^^
AttributeError: 'ImageModelDescriptor' object has no attribute 'modules'
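
A possible workaround (not confirmed in this thread): spandrel's `ImageModelDescriptor` is a wrapper around the actual `torch.nn.Module`, so `torch.onnx.export` fails when it tries to call `.modules()` on the wrapper. Unwrapping the descriptor before exporting should avoid the `AttributeError`. A minimal sketch, assuming the wrapper exposes the inner network as `.model` (the `FakeDescriptor` class below is a stand-in for illustration only):

```python
# Sketch: torch.onnx.export needs a real nn.Module, but the upscale loader
# hands back a spandrel ImageModelDescriptor wrapper.
# Assumption: the wrapper exposes the inner network as `.model`.

class FakeDescriptor:
    """Stand-in for spandrel.ImageModelDescriptor, for illustration only."""
    def __init__(self, model):
        self.model = model

def unwrap(maybe_descriptor):
    """Return the wrapped module if present, otherwise the object itself."""
    return getattr(maybe_descriptor, "model", maybe_descriptor)

inner = object()  # pretend this is the actual torch.nn.Module
assert unwrap(FakeDescriptor(inner)) is inner  # descriptor gets unwrapped
assert unwrap(inner) is inner                  # plain modules pass through
```

In export_onnx.py one would then pass `unwrap(upscale_model)` to `torch.onnx.export` instead of the descriptor itself.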

and this:

c:\ai\comfyui>.\python_embeded\python.exe c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_trt.py
Traceback (most recent call last):
  File "c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_trt.py", line 3, in <module>
    from utilities import Engine
ModuleNotFoundError: No module named 'utilities'
@yuvraj108c
Owner

Might be related: #11

@Maelstrom2014
Author

I found another solution:
put this at the beginning of export_trt.py:

import os, sys
file_dir = os.path.dirname(__file__)
sys.path.append(file_dir)
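
For reference, a slightly hardened version of the same fix (the idea is identical; this variant resolves an absolute path and avoids duplicate entries, so sibling modules like `utilities.py` import cleanly regardless of the working directory the script is launched from):

```python
import os
import sys

# Make sibling modules (e.g. utilities.py) importable even when the script
# is launched from a different working directory, as in the command above.
file_dir = os.path.dirname(os.path.abspath(__file__))
if file_dir not in sys.path:
    sys.path.append(file_dir)
```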

@yuvraj108c
Owner

Nice!

@Maelstrom2014
Author

> Nice!

export onnx still not working.

@DarkAlchy

Yeah, because a node updated my TensorRT, none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.

I need to compile the .pth files into .onnx files, then compile those into engines using the 10.1.0.27 I have now, but the script is just flat-out no good. It finds everything alright but has internal errors.

@yuvraj108c
Owner

> Nice!
>
> export onnx still not working.

I don't recommend exporting onnx models, because most models won't work and it requires a lot of RAM, but the instructions are in export_onnx.py.

@yuvraj108c
Owner

> Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
>
> I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.

You don't need to convert .pth to .onnx; just run export_trt.py with the new TensorRT version.

@Maelstrom2014
Author

> > Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
> > I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.
>
> you don't need to convert .pth to .onnx, just run export_trt.py with the new tensorrt version

I need to convert another .pth file.

@DarkAlchy

DarkAlchy commented Jun 18, 2024

> > Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
> > I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.
>
> you don't need to convert .pth to .onnx, just run export_trt.py with the new tensorrt version

Incorrect, as I did that twice and the same error came up.

edit: I am just going to start over from scratch, as that yolo world (Zho Zho Zho) is a known bad node; I just did not know, and it zapped me.

Now I do have a ton (7 GB) of .pth files I would like to convert, which should work.
