Compiling modules on Windows? #11
@MrCheeze
There are also some other pending PRs:
In addition, my local version also has a patch to mostly disable logging and to disable profiling codegen to reduce output size; benchmarking is enabled as it is useful. The reason for the logging patch is that the full path is displayed, which looks messy in the logs and could inadvertently include the user's personal information.

I do not have a wheel for my current version, as I've made some of those changes directly to the installed version; however, I've attached that copy, and it can replace an existing installation.

Also attached are precompiled modules. They are for sm80, batch size 1 (technically 1 to 2 for classifier-free guidance). I've opted for the .dll extension; on the above PRs, the module selection code in this repo uses

Note the compressed extension.

Expected VRAM usage according to AITemplate's memory planning
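As a rough illustration of the platform-dependent extension choice discussed above (function names here are hypothetical sketches, not the plugin's actual code):

```python
import platform

def module_extension() -> str:
    # AITemplate modules are compiled shared libraries:
    # .dll on Windows, .so on Linux.
    return ".dll" if platform.system() == "Windows" else ".so"

def module_filename(name: str) -> str:
    # e.g. "unet_xl_sm80_bs1" -> "unet_xl_sm80_bs1.dll" on Windows
    return name + module_extension()
```

The key point is that the extension encodes the target OS, so a `.dll` module can never be loaded on Linux and vice versa.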
Fresh VAE modules, fp16, compatible with all SD VAEs that work in fp16; for XL use madebyollin/sdxl-vae-fp16-fix. Usage
thanks! where do I put these? I tried to place them in
I'm on Linux (Ubuntu 22.04) and would also looooove to get AITemplate support for SDXL. I downloaded the modules from here: https://huggingface.co/city96/AITemplate/tree/main/hlky
User selection of modules was replaced with automatic selection based on OS, CUDA arch, batch size, and resolution, so the modules need to be uploaded to huggingface and then added to modules.json. Until then you could add them to
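A minimal sketch of what such automatic selection could look like; the real modules.json schema is not shown in this thread, so the keys below (`os`, `cuda_arch`, `batch_size`, `max_resolution`, `url`) are assumptions for illustration only:

```python
import json

def select_module(modules, os_name, sm, batch_size, width, height):
    # Return the URL of the first entry matching the runtime
    # environment, or None if no compatible module exists.
    for m in modules:
        if (m["os"] == os_name
                and m["cuda_arch"] == sm
                and m["batch_size"] >= batch_size
                and m["max_resolution"] >= max(width, height)):
            return m["url"]
    return None

# Hypothetical modules.json content:
modules = json.loads("""[
  {"os": "windows", "cuda_arch": "sm80", "batch_size": 2,
   "max_resolution": 1024, "url": "unet_xl_sm80.dll"}
]""")
```

With this shape, a Windows user on an sm80 GPU at 1024x1024 would match the entry, while a Linux user would get `None` and need a separately compiled `.so` module.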
Modules themselves go in the same folder as

@CyberTimon Those particular modules are for Windows. I don't have WSL set up at the moment, but I will compile Linux XL modules soon. Just to note, cmake-related things do not affect Linux.
Thank you for your answer!
Yes. AIT modules are architecture specific, not checkpoint specific (as far as I'm aware, at least).
Wow! Didn't know that, very cool! I always had to recompile and redo all my TensorRT models.
Great, thank you
XL compilation works at the moment with that PR and the others mentioned, 860 and 875. With Linux it is easier; check the readme installation instructions. There is also a Docker image, although there are not many dependencies to install other than CUDA and the usual. The PRs should be merged before installation/creating the Docker image, or changes can be made to an already installed package (e.g.

The attached build of AITemplate is set up for Windows with

Yes, the modules are architecture specific, not checkpoint specific. This plugin applies weights at runtime from the model loaded in the workflow. On Linux, weights of any size can be included in the module; on Windows, including XL weights does not work due to the size (I think the limit is 2 GB), so it works for v1/v2.
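A back-of-the-envelope check for the Windows size constraint mentioned above; the 2 GB figure is the thread's own estimate, and this helper is an illustration rather than anything the plugin actually does:

```python
# Approximate practical size limit for a single Windows DLL (~2 GB).
WINDOWS_DLL_LIMIT = 2 * 1024**3

def weights_fit_in_module(num_params: int, bytes_per_param: int = 2) -> bool:
    # fp16 weights take 2 bytes per parameter; weights embedded in
    # the module must fit under the DLL size limit.
    return num_params * bytes_per_param < WINDOWS_DLL_LIMIT

# SD v1's UNet (~860M params, fp16, ~1.7 GB) fits under the limit,
# while SDXL's UNet (~2.6B params, ~5.2 GB) does not, which is why
# XL weights cannot be baked into a Windows module.
```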
oh, so the modules you linked here don't work on Windows yet? I mean the DLL ones
@Shaistrong The SDXL modules attached here with .dll extension, and the existing SDXL modules on huggingface are for Windows. |
oh, gotcha. so, when will modules.json be updated? or is there a version of the node that lets you select the module?
Hey @hlky, do I need to set this to false to only get these small files?
It's downloading the SDXL pipeline! Hope this works, would be soooo cool.
nice! I already got the Windows ones, but I can't get the custom node to see them
@Shaistrong @CyberTimon Yes, set
Hey @hlky I'm getting this issue:
Do you need any more info about my system, etc.? I have an RTX 3090, Ubuntu 22.04. NVCC output:
Nvidia SMI says
@CyberTimon I think CUDA 11.5 is too old; the earliest version mentioned on the AITemplate repo is 11.6. You might have multiple versions installed and need to set some environment variables, but IIRC nvidia-smi will display the

I also see

I used the Docker image before; I'd try that.
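The distinction above matters because nvidia-smi reports the driver's maximum supported CUDA version, while nvcc reports the toolkit actually used for compilation. A small sketch for checking the toolkit version; the parsing is based on the usual format of `nvcc --version` output and the 11.6 minimum is the figure from this thread:

```python
import re

def nvcc_cuda_version(output: str) -> tuple:
    # nvcc --version prints a line like:
    #   Cuda compilation tools, release 11.5, V11.5.119
    m = re.search(r"release (\d+)\.(\d+)", output)
    if not m:
        raise ValueError("could not parse nvcc output")
    return int(m.group(1)), int(m.group(2))

def toolkit_ok(output: str) -> bool:
    # AITemplate's earliest documented CUDA version is 11.6.
    return nvcc_cuda_version(output) >= (11, 6)
```

Feed it the stdout of `nvcc --version`; if this reports an old version while nvidia-smi shows a newer one, the PATH is likely pointing at an older toolkit install.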
Thanks. I'm new to Docker. I will try to compile it with Docker. 👍 Thank you again.
Also, I'll try to upgrade my CUDA version. Is 11.7 good, or should I install 11.8?
Anything past 11.6 should be good, I'd grab the latest for Ubuntu 22.04, I use 12.x for Windows.
For this part I think update |
How can I update gcc? I can't find any info. I executed sudo apt get install gcc/g++ 12 but I can't go past 12 because it doesn't find the package.
We are adding a lot of noise to this thread, which is supposed to be about Windows. I'm not sure what's wrong with your environment and it's difficult to diagnose. I think it's best if you wait for me to compile Linux modules and then wait for them to be added to the plugin. I am beginning the process of compiling some SDXL Linux modules now.
Oh great, and yes, you're right. Something is always wrong with my Nvidia version.. haha. Will try to fix it myself. Thank you very much for helping me. Please ping me when you've compiled some SDXL Linux modules! :) Thanks
when is FizzleDorf going to add the modules so that the node will detect them? |
SDXL modules now load from commit 942680d
Is there any information on how to reproduce the Windows-native modules you have provided at https://huggingface.co/Fizzledorf/AITemplate_V1_V2/tree/main/modules/windows ? From what I've seen, the AITemplate codebase seems very Linux-specific, and I can't find any information anywhere on cross-compiling from Linux to Windows either (not even with WSL). How were those modules made?