Add ComfyUI #94
base: master
Conversation
I'll investigate more when I get back from work, but when trying to set config.cudaSupport = true …
config.cudaSupport = false
Does …? Anyhow, I've removed it because I figured out the reason I had to add it. Thanks for (presumably) testing with an AMD card.
I don't know why that would happen; it works on my end. What is your environment? NixOS?
Yes, the resulting error was in the …
Yes, this is being tested on a 6950 XT.
Yes, I'm on NixOS Unstable. Not caused by this PR or Nixified-AI, but I am currently failing to build roctracer
Thanks for the call-out and your work on this! I took the liberty of copying your added custom nodes to my nixpkgs comfyui fork. While it's all still a draft, I'm working on moving all of it out of my dotfiles. I await a reply from the original author for write permission to the branch.
I'm experimenting with parametrising the whole flake over a configurable set of options by allowing the user to override a flake (nixified-cfg) with one of their own, which holds their configuration along with a library of models. This approach is currently the only way (as far as I can tell) to parametrise a flake, i.e. supply a flake with arguments (cf. NixOS/nix#5663). It's a little clunky, but it's declarative, allowing one to manage one's own models and custom nodes (TBI) with a personal flake, which seems like a neat way to organise one's configurations in any case. Edit: forgot to link the default (for now) nixified-cfg: https://github.com/lboklin/nixified-cfg. Its readme has basic instructions, but basically you clone it and pass it in with …
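A sketch of what passing in one's own cfg flake could look like, assuming the flake input is literally named nixified-cfg and using an illustrative package attribute (the readme's actual instructions may differ):

# clone and edit your own copy of nixified-cfg, then point the input at it:
nix run --override-input nixified-cfg path:./my-nixified-cfg github:lboklin/nixified-ai#comfyui-amd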
I've gotten all the ROCm stuff to build, so now I'm on to the actual ComfyUI stuff. It seems to be trying to access a non-existent …
Log
Log 2
I think the problem makes sense because /var/lib is mutable and shouldn't be accessed during build time. I'm working on model collections so that you basically build a collection of everything in the config flake and then comfyui points to that collection in its …
Just a heads-up: I'm restructuring a bunch and things are quite broken atm.
As of this moment, this should work (for nvidia) and give a minimal setup: …
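The snippet itself appears to have been lost in the collapse above; as a rough guess at its shape (the package attribute is illustrative only):

nix run github:lboklin/nixified-ai#comfyui-nvidia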
I swapped the hardcoded paths in the package.nix for some user-accessible ones in …
Do you mean the subdirectories of the models? I suppose those ought to be configurable as well. Hadn't thought of that!
I tried to get a readable diff to show, but I hardcoded the …
[projects/comfyui/package.nix: diff parts 1/4, 2/4, 3/4]
@Airradda you can generate a diff with … Edit: also, it looks like you are on a slightly older commit (unless you added the defaults yourself; I recently removed them).
So the idea with the declarative model management is that you customise your set of models in the cfg flake, here. You can of course modify nixified-ai like you did and manage models yourself, mutably; but in this PR my goal is to make that unnecessary, because all of that would be handled the Nix way, and it should be easy to add what you need right in your own cfg flake.
I hope I didn't lie by stating in the comments at the top of the file that downloads are cached even if the checksum fails to match. It seemed to be the case once, but another time it seemed to redownload the whole thing once I added the correct checksum. That is of course less than ideal, because some models can be very large, and you really have no good way (afaik) of getting the checksum before deliberately using an incorrect one.
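For what it's worth, the standard workaround in plain Nix looks like the sketch below (the URL is a placeholder, and this is not necessarily how this flake's model fetcher is implemented): start with a deliberately fake hash and let the mismatch error report the real one.

let
  pkgs = import <nixpkgs> { };
in
pkgs.fetchurl {
  # placeholder URL for some large model file
  url = "https://example.com/models/some-model.safetensors";
  # lib.fakeHash guarantees a mismatch; the failed download then prints the
  # correct hash, which you paste in here and rebuild.
  hash = pkgs.lib.fakeHash;
}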
That is where what I posted came from. The commit is from last night. I explicitly added them back in before hardcoding them, as the defaults weren't applying.
I was also trying to get a minimal setup going before messing with the cfg/Nix stuff. The setup I had before, running in a rocm-pytorch container with podman, broke, so I was trying to have minimal downtime. Tonight I will probably start messing with and/or move to the cfg-based setup and see how that goes.
I've made progress on implementing custom node management in addition to models, but there is a hurdle: custom nodes can't have their own dependencies, and I don't know how to solve the problem cleanly. Anyway, I've been trying to make a config that has everything the Krita AI plugin needs, and all requirements are met except one (controlnet_aux) due to dependencies. If one is eager, they can be added manually to the comfyui package. I'm doing this on a separate branch because there is a minor change in the config "api" that I just haven't synced across both projects yet outside of these two branches (https://github.com/lboklin/nixified-ai/tree/comfyui-krita-requirements and https://github.com/lboklin/nixified-cfg/tree/comfyui-krita-requirements).
@lboklin I'm not behind my computer (to provide links), but I've been doing custom nodes in my dotfiles, and I've been handling dependencies in there. These dependencies are bundled up to become ComfyUI's dependencies. Does that solve your issue? My nixpkgs branch should also demonstrate this.
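If I understand the approach correctly, it is roughly the pattern sketched below (every attribute name here is hypothetical rather than the actual API of that branch): each custom node records the Python packages it needs, and the wrapper folds them into the Python environment that ComfyUI itself runs in.

{ python3, fetchFromGitHub }:
let
  # a custom node with extra Python dependencies (rev/hash are placeholders)
  controlnet-aux = {
    src = fetchFromGitHub {
      owner = "Fannovel16";
      repo = "comfyui_controlnet_aux";
      rev = "...";
      hash = "...";
    };
    pythonDeps = ps: [ ps.opencv4 ps.timm ];
  };
  customNodes = [ controlnet-aux ];
in
# ComfyUI's own dependencies (abridged) plus every custom node's dependencies
python3.withPackages (ps:
  (with ps; [ torch torchvision pillow ])
  ++ builtins.concatMap (node: node.pythonDeps ps) customNodes)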
@LoganBarnett I'm also not at the computer atm, but I was using your nixpkgs fork as a reference. If I try to do it the same way, I get "... is a string with context when a set was expected". One of your comments mentioned this problem with linkFarm, so maybe an alternative to that could be concocted, but my intuition is that the problem is greater than that. I'll have a look at it again tomorrow.
Alright, I solved the problem with custom node dependencies. For one, I had to set the derivations' … So both declarative model management and custom nodes seem to work now. Something is still missing for the Krita plugin, so I'm looking into that now, and after that I'll redirect my attention to the NixOS module.
Based on feedback I've moved everything back into this main repo and added a basic package for running a krita-ai server (…).
Example usage of …:
I have built and generated an image using … and I can confirm it is properly making full use of my AMD GPU.
I've added a bunch of models and changed the outputs a bit. Examples of how one could use the new outputs:

# run a server with absolutely all models and custom nodes:
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins (plugins: plugins)'

# all the checkpoint models but no custom nodes:
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })'

# run a krita ai server with all optional models included (controlnets and such):
nix run github:lboklin/nixified-ai#krita-comfyui-server-{nvidia,amd}

# run a minimal krita ai server with a custom model set (but please actually put the expression in a file):
nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; kritaServerWithModels (ms: with ms; { checkpoints = { inherit (checkpoints) colossus-xl-v6; }; ipadapter = { inherit (ipadapter) ip-adapter-faceid-plusv2_sdxl; }; loras = { inherit (loras) ip-adapter-faceid-plusv2_sdxl_lora; }; })'

It seems to me like it's not uncommon for custom nodes to require certain models, so I added a passthru for that as well. reactor-node is broken because it tries to write to the models dir, but I'm leaving it there.
Error Log
I think the draft status made this seem less complete than it is, so I undrafted it even though there are a couple of remaining problems. @Azeirah I pushed an unsquashed branch that excludes the recent update to comfyui, in case you want to use an older version that still works. I'm not sure if flux works on that older version though.
I managed to get the newest comfyui to run by creating an empty override for torchaudio, so instead of trying to build torchaudio incorrectly, it skips it entirely. ComfyUI does run; torchaudio is not a hard dependency, but you do get a warning on startup. Flux has only been supported since the day of release of the flux model itself. The previous version was from June IIRC, and flux is definitely not supported in that version. I really hope to learn more about Nix so I can fix the ROCm situation for myself. I spent like 8 hours last Sunday trying to get it to work 😅
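For the curious, one way to stub a package out of the Python set looks roughly like this (an assumption on my part; possibly not the exact override used above): replace torchaudio with an empty placeholder so nothing attempts the broken build and ComfyUI merely warns at startup.

let
  pkgs = import <nixpkgs> { };
  python = pkgs.python3.override {
    packageOverrides = final: prev: {
      # swap the real torchaudio for an empty placeholder derivation
      torchaudio = prev.toPythonModule
        (pkgs.runCommand "torchaudio-stub" { } "mkdir -p $out");
    };
  };
in
python.withPackages (ps: [ ps.torch ps.torchaudio ])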
llama.cpp manages to use Vulkan to support AMD and Nvidia GPUs working together, without library compatibility issues. Is there not something similar going on in the diffusion ecosystem? ZLUDA was also taken down recently, which is a big blow to compatibility efforts: https://www.phoronix.com/news/AMD-ZLUDA-CUDA-Taken-Down
@lboklin FWIW I did steal some of your Krita-related work and made a custom node bundle for it here: https://github.com/NixOS/nixpkgs/pull/268378/files#diff-d336deacc925ae350d52ed5210be2622aa537cb840b743f6852d9d20299b1f0c Though I haven't done a lot to keep up with your work. I'd like to take a look again when …
You can copy all you want! Being a maintainer is not something I wish to inflict on myself, so the best-case scenario is if (parts of) my work can be of use without my name ending up in the maintainers list 😅
Just to give a heads-up: I'm considering throwing out a lot of models in favour of making it easier to add your own. I'm even implementing utility functions for generating model libraries directly from calls to (currently only) civitai's API, so that …
There are just too many models out there, so it doesn't make much sense to keep throwing them into this repo beyond those required for the krita plugin and perhaps some very common ones. The heads-up part of this is that it will obviously make previously available models unavailable for those relying on them. I'm also foregoing the format of naming every single model in the set in favour of using attribute names for installPath, since that fills the function of both identifying a model and specifying where it will go. After all, they're not a package set, but a collection of installable models.
I tried to make it as pain-free as possible to adapt to the new model installation format. Let me know if there are any problems. In addition to the commit notes, comfyui and several nodes were updated to more recent versions.
…anges:
- remove the accumulated general library of models
- change how models are declared/installed (with temporary back-compatibility)
- phase out fetchFromHuggingFace (not very useful util)
- add support for the AIR spec
- add comfyui types and type utils
- improve type checking, errors, warnings, and comments
I left out the model set generator because it's quite orthogonal to this PR specifically and perhaps more suited to a separate repo entirely; honestly, the utility is marginal unless someone enjoys the idea of generating an absolutely massive set of rubbish models and then importing the few that may be of interest. I spent more time on it than I should have, I think.
Fixed ReActor URL
Fix ReActor URL
This is based on other people's work, most notably @fazo96's PR and @LoganBarnett's continuation of it.
Currently it is mostly an original implementation with a focus on making it easy to spin up a server without necessarily adding it to one's system config.
Remaining problems:
Q&A
It's fully functional. Perhaps most notably lacking in this PR is a NixOS module. This implementation is simply a package, and you configure it by overriding with your own settings, models, and custom nodes. Then you simply nix run that overridden package.
What this PR has over the nixpkgs PR is "<modelDir>/<modelFile>" = { url = "..."; sha256 = "..."; authToken = "..."; };, where authToken is optional and url can instead be either air or file. (BEWARE: authToken ends up as plain text in the nix store.)
I don't have any concrete plans to implement a NixOS module at the moment, since that overlaps a lot with the nixpkgs PR, and personally I don't mind so much having to spin it up directly when I intend to generate images, because I only use it on my PC and not on a dedicated server.
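As a concrete illustration of that format (the URL and hash are placeholders, the install path is only an example, and the surrounding attribute set is abbreviated):

models = {
  "checkpoints/colossus-xl-v6.safetensors" = {
    url = "https://example.com/colossus-xl-v6.safetensors"; # placeholder URL
    sha256 = "sha256-placeholderplaceholderplaceholderplaceholder="; # placeholder hash
    # authToken = "..."; # optional; note it ends up as plain text in the nix store
  };
};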
All one has to do is to go to the PR's source repo and mentally substitute mentions of nixified-ai/flake with lboklin/nixified-ai and modify the template's input ref after initialising it. Let me know if the readme and template still leave questions unanswered.
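For example, under the assumption that a template is exposed the same way as in the upstream flake (the template name below is a guess; list the flake's outputs first to find the real one):

# see which templates and packages the fork exposes
nix flake show github:lboklin/nixified-ai
# initialise from a template, then edit the generated flake.nix and change
# the input ref from github:nixified-ai/flake to github:lboklin/nixified-ai
nix flake init -t github:lboklin/nixified-ai#comfyui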