Add ComfyUI #94

Open · wants to merge 16 commits into base: master
Conversation

@lboklin commented Apr 22, 2024

This is based on other people's work, most notably @fazo96's PR and @LoganBarnett's continuation of it.

Currently it is mostly an original implementation with a focus on making it easy to spin up a server without necessarily adding it to one's system config.

Remaining problems:

  • figure out what to do about all the other packages being broken by the updated nixpkgs input
  • the ROCm version appears to be broken since torchaudio was added as a dependency

Q&A

What’s the status of this PR, and what does it bring over the work done in the nixpkgs PR?

It's fully functional. The most notable thing lacking in this PR is a NixOS module. This implementation is simply a package, and you configure it by overriding it with your own settings, models, and custom nodes. Then you simply nix run that overridden package.

What this PR has over the nixpkgs PR is

  • a selection of packaged custom nodes and a super simple interface for adding any model you like to your own library
    • literally just "<modelDir>/<modelFile>" = { url = "..."; sha256 = "..."; authToken = "..."; };, where authToken is optional and url can instead be either air or file. (BEWARE: authToken ends up as plain text in the Nix store)
  • a ready-to-use Krita plugin server (three variants, in fact)
  • a template that should demonstrate how to use it with concrete and functional examples.
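As a rough sketch, an override using the model interface described above might look like this. The package name comfyui and the exact override shape are assumptions based on the description in this PR, and the URL and hash are placeholders:

```nix
comfyui.override {
  models = {
    # The attribute name "<modelDir>/<modelFile>" decides where the model
    # is placed in the model library; url and sha256 are placeholders here.
    # authToken (not shown) is optional and ends up as plain text in the
    # Nix store.
    "checkpoints/dreamshaper_8.safetensors" = {
      url = "https://example.com/dreamshaper_8.safetensors";
      sha256 = "0000000000000000000000000000000000000000000000000000";
    };
  };
}
```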

NixOS module?

I don't have any concrete plans to implement a NixOS module at the moment, since that overlaps a lot with the nixpkgs PR. Personally I don't mind having to spin it up directly when I intend to generate images, because I only use it on my PC and not on a dedicated server.

How to use this?

All one has to do is go to the PR's source repo, mentally substitute mentions of nixified-ai/flake with lboklin/nixified-ai, and modify the template's input ref after initialising it. Let me know if the readme and template still leave questions unanswered.

@Airradda

I'll investigate more when I get back from work, but when trying to nix run .#comfyui-amd I am currently getting:

config.cudaSupport = true
Traceback (most recent call last):
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfyui", line 76, in <module>
    import execution
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/execution.py", line 11, in <module>
    import nodes
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/sd.py", line 5, in <module>
    from comfy import model_management
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/model_management.py", line 119, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/model_management.py", line 88, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c8vn2rv4lv86sxp8p89vxx36pl8p0xcr-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 787, in current_device
    _lazy_init()
  File "/nix/store/c8vn2rv4lv86sxp8p89vxx36pl8p0xcr-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 302, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
config.cudaSupport = false
Traceback (most recent call last):
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfyui", line 76, in <module>
      import execution
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/execution.py", line 11, in <module>
      import nodes
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/nodes.py", line 21, in <module>
      import comfy.diffusers_load
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/diffusers_load.py", line 3, in <module>
      import comfy.sd
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/sd.py", line 5, in <module>
      from comfy import model_management
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/model_management.py", line 119, in <module>
      total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                    ^^^^^^^^^^^^^^^^^^
    File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/model_management.py", line 88, in get_torch_device
      return torch.device(torch.cuda.current_device())
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/nix/store/m0lgwj7r0llsmcqqjfbswkja8n8n48hx-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 787, in current_device
      _lazy_init()
    File "/nix/store/m0lgwj7r0llsmcqqjfbswkja8n8n48hx-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 293, in _lazy_init
      raise AssertionError("Torch not compiled with CUDA enabled")
  AssertionError: Torch not compiled with CUDA enabled

nix build .#comfyui-amd works; however, the resulting ./result/bin/comfyui fails with:

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

@lboklin (Author) commented Apr 23, 2024

I'll investigate more when I get back from work. but when trying to nix run .#comfyui-amd I am currently getting:
config.cudaSupport = true
config.cudaSupport = false

nix build .#comfyui-amd works. however the resulting ./result/bin/comfyui fails with:

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

Does config.cudaSupport = true give you trouble when you nix run .#comfyui-amd?

Anyhow I've removed it because I figured out the reason I had to add it.

Thanks for (presumably) testing with an AMD card.

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

I don't know why that would happen; it works on my end. What is your environment? NixOS?

@Airradda

I'll investigate more when I get back from work. but when trying to nix run .#comfyui-amd I am currently getting:
config.cudaSupport = true
config.cudaSupport = false
nix build .#comfyui-amd works. however the resulting ./result/bin/comfyui fails with:

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

Does config.cudaSupport = true give you trouble when you nix run .#comfyui-amd?

Yes, the resulting error was in the config.cudaSupport = true details.

Thanks for (presumably) testing with an AMD card.

Yes, this is being tested on a 6950 XT.

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

I don't know why that would happen; it works on my end. What is your environment? NixOS?

Yes, I'm on NixOS Unstable.


Not caused by this PR or Nixified-AI, but I am currently failing to build roctracer for both ROCm 5 and 6, so I can't test any more until I find out why:

roctracer
@nix { "action": "setPhase", "phase": "unpackPhase" }
Running phase: unpackPhase
unpacking source archive /nix/store/zvww1d6zkf2gnva6j2ccd82axj075l4s-source
source root is source
@nix { "action": "setPhase", "phase": "patchPhase" }
Running phase: patchPhase
substituteStream(): WARNING: '--replace' is deprecated, use --replace-{fail,warn,quiet}. (file 'CMakeLists.txt')
@nix { "action": "setPhase", "phase": "updateAutotoolsGnuConfigScriptsPhase" }
Running phase: updateAutotoolsGnuConfigScriptsPhase
@nix { "action": "setPhase", "phase": "configurePhase" }
Running phase: configurePhase
fixing cmake files...
cmake flags: -DCMAKE_FIND_USE_SYSTEM_PACKAGE_REGISTRY=OFF -DCMAKE_FIND_USE_PACKAGE_REGISTRY=OFF -DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON -DCMAKE_BUILD_TYPE=Release -DBUILD_TESTING=OFF -DCMAKE_INSTALL_LOCALEDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/share/locale -DCMAKE_INSTALL_LIBEXECDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/libexec -DCMAKE_INSTALL_LIBDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/lib -DCMAKE_INSTALL_DOCDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/share/doc/roctrace>
-- The C compiler identification is GNU 12.3.0
-- The CXX compiler identification is GNU 12.3.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
CMake Deprecation Warning at /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config.cmake:20 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.
Call Stack (most recent call first):
  CMakeLists.txt:49 (find_package)


CMake Deprecation Warning at /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config-amd.cmake:21 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.
Call Stack (most recent call first):
  /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config.cmake:150 (include)
  CMakeLists.txt:49 (find_package)


-- hip::amdhip64 is SHARED_LIBRARY
-- /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/g++: CLANGRT compiler options not supported.
-- Found Python3: /nix/store/glfr70gi7hfaj50mwj2431p8bg60fhqw-python3-3.11.9/bin/python3.11 (found version "3.11.9") found components: Interpreter
CMake Error at src/CMakeLists.txt:77 (find_file):
  Could not find HIP_RUNTIME_API_H using the following files:
  hip_runtime_api.h

-- Configuring incomplete, errors occurred!

@LoganBarnett

Thanks for the call out and your work on this!

I took the liberty of copying your added custom nodes to my nixpkgs comfyui fork. It's all still a draft; I'm working on moving all of it out of my dotfiles. I await a reply from the original author for write permission to the branch.

@lboklin (Author) commented Apr 26, 2024

I'm experimenting with parametrising the whole flake over a configurable set of options, by allowing the user to override a flake input (nixified-cfg) with one of their own, which holds their configuration along with a library of models.

This approach is currently the only way (as far as I can tell) to parametrise a flake (supply a flake with arguments, cf. NixOS/nix#5663). It's a little clunky, but it's declarative, allowing one to manage one's own models and custom nodes (TBI) with a personal flake, which seems like a neat way to organise one's configurations in any case.

Edit: forgot to link the default (for now) nixified-cfg: https://github.com/lboklin/nixified-cfg. Its readme has basic instructions, but basically you clone it and pass it in with --override-input nixified-cfg <cloned-cfg> when running comfyui.
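Concretely, the workflow described above might look something like the following (the clone destination and the nvidia variant are illustrative choices, not prescribed by the PR):

```shell
# Clone the default config flake and adapt it to your needs.
git clone https://github.com/lboklin/nixified-cfg my-cfg
# Edit my-cfg to declare your models and settings, then run comfyui
# with your config flake swapped in for the default input:
nix run github:lboklin/nixified-ai#comfyui-nvidia --override-input nixified-cfg path:./my-cfg
```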

@Airradda

I've gotten all the ROCm stuff to build, so now I'm on to the actual ComfyUI stuff. It seems to be trying to access a non-existent /var/lib/comfyui/user directory, which also appears not to be one of the directories configurable via flags (see Log 2). I'll do a more in-depth run after I get back from work.

Log
Total VRAM 16368 MB, total RAM 32020 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 AMD Radeon RX 6950 XT : native
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
Setting temp directory to: /var/lib/comfyui/temp/temp
Traceback (most recent call last):
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/comfyui", line 206, in <module>
    server = server.PromptServer(loop)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/server.py", line 70, in __init__
    self.user_manager = UserManager()
                        ^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/app/user_manager.py", line 20, in __init__
    os.mkdir(user_directory)
FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/comfyui/user'
Log 2

nix run .#comfyui-amd -- --input-directory /home/airradda/Documents/Comfy-UI/Inputs --output-directory /home/airradda/Documents/Comfy-UI/Output --temp-directory /home/airradda/Documents/Comfy-UI/Temp

Total VRAM 16368 MB, total RAM 32020 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 AMD Radeon RX 6950 XT : native
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
Setting temp directory to: /home/airradda/Documents/Comfy-UI/Temp/temp
Traceback (most recent call last):
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/comfyui", line 206, in <module>
    server = server.PromptServer(loop)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/server.py", line 70, in __init__
    self.user_manager = UserManager()
                        ^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/app/user_manager.py", line 20, in __init__
    os.mkdir(user_directory)
FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/comfyui/user'

@lboklin (Author) commented Apr 26, 2024

I've gotten all the ROCM stuff to build, so now I'm on to the actual Comfy-UI stuff. It seems to be trying to access a non-existent /var/lib/comfy-ui/user directory. It also appears to not be one of the directories configurable via flags (See Log 2). I'll do a more in depth run after I get back from work.
Log
Log 2

I think the problem makes sense: /var/lib is mutable and shouldn't be accessed at build time. I'm working on model collections, so that you build a collection of everything in the config flake and then comfyui points to that collection in its extra_model_paths.yaml.

@lboklin (Author) commented Apr 26, 2024

Just a heads-up: I'm restructuring a bunch and things are quite broken atm.

@lboklin (Author) commented Apr 26, 2024

As of this moment, this should work (for nvidia) and give a minimal setup: nix run github:lboklin/nixified-ai#comfyui-nvidia --override-input nixified-cfg github:lboklin/nixified-cfg/eb69f4c62fa0ce19eee2e8a4a5d601176a398bfd (the most recent commit on the comfyui-minimal branch as of now).

@Airradda

I swapped the hardcoded paths in the package.nix for some user accessible ones in ~/Documents/Comfy-UI and, using your input override, I have successfully generated an image. This includes full use of the GPU during generation.

@lboklin (Author) commented Apr 27, 2024

I swapped the hardcoded paths in the package.nix for some user accessible ones in ~/Documents/Comfy-UI and, using your input override, I have successfully generated an image. This includes full use of the GPU during generation.

Do you mean the subdirectories of the models? I suppose those ought to be configurable as well. Hadn't thought of that!

@Airradda

I tried to get a readable diff to show; in the end I hardcoded the config-data.comfyui.base_path and the inputPath, outputPath, tempPath, and userPath.

projects/comfyui/package.nix (1/4):

, stdenv
, symlinkJoin
, config
, modelsPath ? "/home/airradda/Documents/Comfy-UI/Models"
, inputPath ? "/home/airradda/Documents/Comfy-UI/Input"
, outputPath ? "/home/airradda/Documents/Comfy-UI/Output"
, tempPath ? "/home/airradda/Documents/Comfy-UI/Temp"
, userPath ? "/home/airradda/Documents/Comfy-UI/User"
, customNodes
, models
}:

projects/comfyui/package.nix (2/4):

  config-data = {
    comfyui = {
-     base_path = modelsPath;
+     base_path = "/home/airradda/Documents/Comfy-UI/Models";
      checkpoints = "${modelsPath}/checkpoints";
      clip = "${modelsPath}/clip";
      clip_vision = "${modelsPath}/clip_vision";

projects/comfyui/package.nix (3/4):

      tqdm
    ] ++ (builtins.concatMap (node: node.dependencies) customNodes)));

    executable = writers.writeDashBin "comfyui" ''
      cd $out && \
      ${pythonEnv}/bin/python comfyui \
-       --input-directory ${inputPath} \
+       --input-directory "/home/airradda/Documents/Comfy-UI/Input" \
-       --output-directory ${outputPath} \
+       --output-directory "/home/airradda/Documents/Comfy-UI/Output" \
        --extra-model-paths-config ${modelPathsFile} \
-       --temp-directory ${tempPath} \
+       --temp-directory "/home/airradda/Documents/Comfy-UI/Temp" \
        "$@"
    '';

@lboklin (Author) commented Apr 27, 2024

@Airradda you can generate a diff with git diff --patch

Edit: also, looks like you are on a slightly older commit (unless you added the defaults yourself - I recently removed them)

@lboklin (Author) commented Apr 27, 2024

So the idea with the declarative model management is that you customise your set of models in the cfg flake, here. You can of course modify nixified-ai like you did and manage models yourself, mutably; but in this PR my goal is to make that unnecessary, because all of that would be handled the Nix way, and it should be easy to add what you need right in your own cfg flake.

@lboklin (Author) commented Apr 27, 2024

I hope I didn't lie by stating in the comments at the top of the file that downloads are cached even if the checksum fails to match. It seemed like they were in one case, but in another it seemed to redownload the whole thing once I added the correct checksum. That is of course less than ideal, because some models can be very large, and you really have no good way (afaik) of getting the checksum other than by deliberately using an incorrect one first.

@Airradda

@Airradda you can generate a diff with git diff --patch

Edit: also, looks like you are on a slightly older commit (unless you added the defaults yourself - I recently removed them)

That is where what I posted came from.

The commit is from last night. I explicitly added the defaults back in before hardcoding them, as they weren't applying.

@Airradda

So the idea with the declarative model management is that you customise your set of models in the cfg flake, here. You can of course modify the nixified-ai like you did and manage models yourself, mutably; but in this PR my goal is to make that unnecessary because all of that would be handled the nix way, and it should be easy to add what you need right in one's own cfg flake.

I was also trying to get a minimal setup going before messing with the cfg/nix stuff. The setup I had before, running in a rocm-pytorch container with podman, broke, so I was trying to have minimal downtime.

Tonight, I will probably start messing with and/or move to the cfg based setup and see how that goes.

@lboklin (Author) commented Apr 29, 2024

I've progressed on implementing custom nodes management in addition to models, but there is a hurdle: custom nodes can't have their own dependencies and I don't know how to solve the problem cleanly.

Anyway, I've been trying to make a config that has everything the Krita AI plugin needs, and all requirements are met except one (controlnet_aux), due to its dependencies. If one is eager, those can be added manually to the comfyui package.

I'm doing this on a separate branch because there is a minor change in the config "api" that I just haven't synced across both projects yet outside of these two branches (https://github.com/lboklin/nixified-ai/tree/comfyui-krita-requirements and https://github.com/lboklin/nixified-cfg/tree/comfyui-krita-requirements).

@LoganBarnett

@lboklin I'm not behind my computer (to provide links), but I've been packaging custom nodes in my dotfiles, and handling their dependencies there. These dependencies are bundled up to become ComfyUI's dependencies. Does that solve your issue?

My nixpkgs branch should also be demonstrating this.

@lboklin (Author) commented Apr 29, 2024

@LoganBarnett I'm also not at the computer atm, but I was using your nixpkgs fork as a reference. If I try to do it the same way, I get "... is a string with context when a set was expected". One of your comments mentioned this problem with linkFarm, so maybe an alternative to that could be concocted, but my intuition is that the problem is greater than that. I'll have a look at it again tomorrow.

@lboklin (Author) commented Apr 30, 2024

Alright, I solved the problem with custom node dependencies. For one, I had to set the derivations' passthru.dependencies, not merely dependencies (the latter were never preserved); but the fact that they weren't propagated was obscured by the previously mentioned error, which was caused by cfg.models and cfg.customNodes in the configuration flake being set to strings containing the outPath of the respective packages rather than to the derivations themselves.

So both declarative model management and custom nodes seem to work now. Something is missing for the Krita plugin still so I'm looking into that now, and after that I'll redirect my attention to the NixOS module.

@lboklin (Author) commented May 6, 2024

Based on feedback I've moved everything back into this main repo and added a basic package for running a krita-ai server (packages.krita-comfyui-server-"${gpuVendor}" / legacyPackages.comfyui."${gpuVendor}".kritaServer) as well as some general functions in legacyPackages.comfyui."${gpuVendor}":

  • withConfig with type
    • { models : Models
      , customNodes : CustomNodes
      , inputPath : String
      , outputPath : String
      , tempPath : String
      , userPath : String
      , ...
      }
  • withPlugins with type (Models -> Models) -> (CustomNodes -> CustomNodes) -> derivation
    • implies the default configuration, which sets all paths to /var/lib/comfyui and directories therein.

Example usage of withPlugins from the commandline: nix build --impure --expr '(builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.nvidia.withPlugins (ms: { checkpoints = { inherit (ms.checkpoints) DreamShaper_8_pruned; }; }) (ns: { inherit (ns) controlnet-aux; })'
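For comparison, a withConfig call might look like this; this is a sketch inferred from the type listed above, with placeholder values rather than a verified invocation:

```nix
(builtins.getFlake "github:lboklin/nixified-ai")
  .legacyPackages.x86_64-linux.comfyui.nvidia.withConfig {
    models = { };       # Models: attrset of model entries
    customNodes = { };  # CustomNodes: attrset of custom-node packages
    inputPath = "/var/lib/comfyui/input";
    outputPath = "/var/lib/comfyui/output";
    tempPath = "/var/lib/comfyui/temp";
    userPath = "/var/lib/comfyui/user";
  }
```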

@Airradda commented May 6, 2024

I have built and generated an image using

nix build --impure --expr '(builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.amd.withPlugins (ms: { checkpoints = { inherit (ms.checkpoints) DreamShaper_8_pruned; }; }) (ns: { inherit (ns) controlnet-aux; })'

and I can confirm it is properly making full use of my AMD GPU.

@lboklin (Author) commented May 9, 2024

I've added a bunch of models and changed the outputs a bit.

Examples of how one could use the new outputs:

# run a server with absolutely all models and custom nodes:
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins (plugins: plugins)'
# all the checkpoint models but no custom nodes
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })'
# run a krita ai server with all optional models included (controlnets and such):
nix run github:lboklin/nixified-ai#krita-comfyui-server-{nvidia,amd}
# run a minimal krita ai server with a custom model set (but please actually put the expression in a file):
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; kritaServerWithModels (ms: with ms; { checkpoints = { inherit (checkpoints) colossus-xl-v6; }; ipadapter = { inherit (ipadapter) ip-adapter-faceid-plusv2_sdxl; }; loras = { inherit (loras) ip-adapter-faceid-plusv2_sdxl_lora; }; })'

It seems to me like it's not uncommon for custom nodes to require certain models, so I added a passthru for that as well.

reactor-node is broken because it tries to write to the models dir, but I'm leaving it there.

@Airradda commented May 9, 2024

nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })' is the only functioning command for me. The others error out because of albumentations, which produces the following error log. Aside from that, I am able to generate an image, and will start trying to migrate from my current nixified-cfg setup to see if I come across anything else.

Error Log
error: builder for '/nix/store/dc59xb4vh3x8ihxqpbnmk3d7l6y5nviv-python3.11-albumentations-1.4.2.drv' failed with exit code 139;
       last 10 log lines:
       >   File "/nix/store/yqcv2gvxqjw25ypg2vaixdfl6qcsgpna-python3.11-pluggy-1.4.0/lib/python3.11/site-packages/pluggy/_hooks.py", line 501 in __call__
       >   File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/_pytest/config/__init__.py", line 175 in main
       >   File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/_pytest/config/__init__.py", line 198 in console_main
       >   File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/pytest/__main__.py", line 7 in <module>
       >   File "<frozen runpy>", line 88 in _run_code
       >   File "<frozen runpy>", line 198 in _run_module_as_main
       >
       > Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, cv2, skimage._shared.geometry, yaml._yaml, scipy._lib._ccallback_c, scipy.ndimage._nd_image, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._decomp_lu_cython, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.special._ellip_harm_2, _ni_label, scipy.ndimage._ni_label, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.spatial.transform._rotation, sklearn.__check_build._check_build, lz4._version, lz4.frame._frame, psutil._psutil_linux, psutil._psutil_posix, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, 
scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._ansari_swilk_statistics, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.stats._unuran.unuran_wrapper, sklearn.utils._isfinite, sklearn.utils.murmurhash, sklearn.utils._openmp_helpers, sklearn.utils.sparsefuncs_fast, sklearn.utils._random, sklearn.utils._seq_dataset, sklearn.metrics.cluster._expected_mutual_info_fast, sklearn.preprocessing._csr_polynomial_expansion, sklearn.preprocessing._target_encoder_fast, sklearn.metrics._dist_metrics, sklearn.metrics._pairwise_distances_reduction._datasets_pair, sklearn.utils._cython_blas, sklearn.metrics._pairwise_distances_reduction._base, sklearn.metrics._pairwise_distances_reduction._middle_term_computer, sklearn.utils._heap, sklearn.utils._sorting, 
sklearn.metrics._pairwise_distances_reduction._argkmin, sklearn.metrics._pairwise_distances_reduction._argkmin_classmode, sklearn.utils._vector_sentinel, sklearn.metrics._pairwise_distances_reduction._radius_neighbors, sklearn.metrics._pairwise_distances_reduction._radius_neighbors_classmode, sklearn.metrics._pairwise_fast, sklearn.linear_model._cd_fast, sklearn._loss._loss, sklearn.utils.arrayfuncs, sklearn.svm._liblinear, sklearn.svm._libsvm, sklearn.svm._libsvm_sparse, sklearn.utils._weight_vector, sklearn.linear_model._sgd_fast, sklearn.linear_model._sag_fast, sklearn.decomposition._online_lda_fast, sklearn.decomposition._cdnmf_fast, skimage.measure._ccomp (total: 155)
       > /nix/store/46c2xhjgxmvdxk69rpdxdkxz3c3dshdi-pytest-check-hook/nix-support/setup-hook: line 53:   401 Segmentation fault      (core dumped) /nix/store/gd3shnza1i50zn8zs04fa729ribr88m9-python3-3.11.8/bin/python3.11 -m pytest -k "not test_transforms"
       > /nix/store/v5lsd029lz5lfhamivbgqyp3zdv94ah2-stdenv-linux/setup: line 1578: pop_var_context: head of shell_variables not a function context
       For full logs, run 'nix log /nix/store/dc59xb4vh3x8ihxqpbnmk3d7l6y5nviv-python3.11-albumentations-1.4.2.drv'.
error: 1 dependencies of derivation '/nix/store/yjkzz7m7rzfg33n8hak2cx7vbns373fn-python3-3.11.8-env.drv' failed to build
error: 1 dependencies of derivation '/nix/store/ing7sbwax4f25i02wpqcdsfzg4iadr25-comfyui.drv' failed to build
error: 1 dependencies of derivation '/nix/store/g4i137cgzbkdyz0a02i38avcz7b5sjyb-comfyui-unstable-2024-04-15.drv' failed to build

@lboklin lboklin marked this pull request as ready for review August 12, 2024 12:31

lboklin commented Aug 12, 2024

I think the draft status made this seem less complete than it is, so I undrafted it even if there are a couple of remaining problems.

@Azeirah I pushed an unsquashed branch that excludes the recent update to comfyui in case you want to use an older version that still works. I'm not sure if flux works on that older version though.


Azeirah commented Aug 12, 2024

> I think the draft status made this seem less complete than it is, so I undrafted it even if there are a couple of remaining problems.
>
> @Azeirah I pushed an unsquashed branch that excludes the recent update to comfyui in case you want to use an older version that still works. I'm not sure if flux works on that older version though.

I managed to get the newest ComfyUI to run by creating an empty override for torchaudio, so instead of trying to build torchaudio incorrectly, it is skipped entirely. ComfyUI still runs since torchaudio is not a hard dependency, though you do get a warning on startup.
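For anyone hitting the same torchaudio build failure, the empty override described above can be sketched as a nixpkgs overlay roughly like this (a hedged sketch, not the exact override used; `pythonPackagesExtensions` and the `torchaudio = null` trick assume the flake's Python environment comes from a standard nixpkgs Python package set):

```nix
# Hypothetical sketch: stub out torchaudio in the Python package set so the
# environment builds without it. Null entries are filtered out of dependency
# lists, so ComfyUI starts with only a missing-torchaudio warning instead of
# failing at build time.
final: prev: {
  pythonPackagesExtensions = (prev.pythonPackagesExtensions or [ ]) ++ [
    (pyFinal: pyPrev: {
      torchaudio = null;
    })
  ];
}
```

How to wire this overlay in depends on how the flake constructs its Python environment, so treat the attribute path as illustrative.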

Flux has only been supported since the day of release of the flux model itself. The previous version was in June IIRC, and flux is definitely not supported in that version.

I really hope to learn more about nix so I can fix the rocm situation for myself. I spent like 8 hours last Sunday trying to get it to work 😅

@MatthewCroughan (Member)

llama.cpp manages to use Vulkan to run on AMD and Nvidia GPUs together without library compatibility issues. Is there not something similar going on in the diffusion ecosystem?

ZLUDA was also taken down recently, which is a big blow to compatibility efforts https://www.phoronix.com/news/AMD-ZLUDA-CUDA-Taken-Down

@LoganBarnett

@lboklin FWIW I did steal some of your Krita related work and made a custom node bundle for it here: https://github.com/NixOS/nixpkgs/pull/268378/files#diff-d336deacc925ae350d52ed5210be2622aa537cb840b743f6852d9d20299b1f0c

Though I haven't done a lot to keep up with your work. I'd like to take a look again when $TIME becomes sufficiently high for me to handle the remaining PR feedback I have.


lboklin commented Sep 1, 2024

> @lboklin FWIW I did steal some of your Krita related work and made a custom node bundle for it here: https://github.com/NixOS/nixpkgs/pull/268378/files#diff-d336deacc925ae350d52ed5210be2622aa537cb840b743f6852d9d20299b1f0c
>
> Though I haven't done a lot to keep up with your work. I'd like to take a look again when $TIME becomes sufficiently high for me to handle the remaining PR feedback I have.

You can copy all you want! Being a maintainer is not something I wish to inflict on myself, so the best case scenario is if (parts of) my work can be of use without my name ending up in the maintainers list 😅


lboklin commented Sep 3, 2024

Just to give a heads up: I'm considering throwing out a lot of models in favour of making it easier to add your own. I'm even implementing utility functions for generating model libraries directly from calls to (currently only) Civitai's API, so that you can generate a big model set with a single command like this:

nix run --show-trace --impure --expr '
  (builtins.getFlake "'${PWD}'").legacyPackages.x86_64-linux.pkgs.genCivitaiModels {
    authToken = "<token>";
    params = {
      sort = "Highest Rated";
      types = ["LORA" "Checkpoint"];
      baseModels = ["Flux.1 S" "Flux.1 D"];
      fileFormats = ["Diffusers" "SafeTensor"];
    };
  }' \
  > flux-models.nix \
  && nix fmt flux-models.nix &>/dev/null

There are just too many models out there, so it doesn't make much sense to keep throwing them into this repo beyond those required for the Krita plugin and perhaps some very common ones.

The heads-up part of this is that it will obviously make previously available models unavailable for those relying on them. I'm also foregoing the format of naming every single model in the set in favour of using attribute names for installPath, since the attribute name then serves both to identify a model and to specify where it will go. After all, they're not a package set, but a collection of installable models.
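The new attribute-name-as-install-path format can be illustrated like this (a hypothetical example: the URLs, hashes, and file names below are placeholders, and the exact override interface is whatever the flake exposes):

```nix
# Hypothetical model declarations in the new format: the attribute name is the
# install path ("<modelDir>/<modelFile>"), which both identifies the model and
# says where it will be installed. URLs and hashes are placeholders.
{
  "checkpoints/example-model.safetensors" = {
    url = "https://example.com/example-model.safetensors";
    sha256 = "0000000000000000000000000000000000000000000000000000";
  };
  "loras/example-lora.safetensors" = {
    url = "https://example.com/example-lora.safetensors";
    sha256 = "0000000000000000000000000000000000000000000000000000";
    # authToken is optional; beware that it ends up as plain text in the nix store
    authToken = "<token>";
  };
}
```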


lboklin commented Sep 12, 2024

I tried to make it as pain-free as possible to adapt to the new model installation format. Let me know if there are any problems.

In addition to the commit notes, comfyui and several nodes were updated to more recent versions.

…anges:

- remove the accumulated general library of models
- change how models are declared/installed (with temporary back-compatibility)
- phase out fetchFromHuggingFace (not very useful util)
- add support for the AIR spec
- add comfyui types and type utils
- improve type checking, errors, warnings, and comments

lboklin commented Sep 12, 2024

I left out the model set generator because it's quite orthogonal to this PR specifically and perhaps more suited to a separate repo entirely, and honestly the utility is marginal unless someone enjoys the idea of generating an absolutely massive set of rubbish models and then importing the few that may be of interest. I spent more time on it than I should have, I think.
