
API Error: Blender plugin calling local Stable Diffusion server under Linux crashes #125

Closed
Skinless-Glue-Fridge opened this issue Sep 13, 2023 · 10 comments

Comments

@Skinless-Glue-Fridge

Describe the bug

Under Linux, the SD server console shows an API error when Blender calls the API of the local SD server via the AI Render plugin. The error occurs immediately when the plugin sends the request to the SD server.

To reproduce

On the same machine, I have set up two identical configurations of Blender, AI Render, and Stable Diffusion with identical versions (Blender 3.6.2, AI Render plugin 0.9.1, Automatic1111 1.6.0, Python 3.10.12), one under Windows and one under Linux.
I am using identical AI model safetensors on both operating systems.

Blender then calls the SD Server using http://localhost:7860.

  • Under Windows, the AI Render plugin works perfectly with SD and Blender.
  • Under Linux, the SD server shows an API error message when Blender calls the API of the local SD server via the AI Render plugin. The same error message is also shown in the Blender AI Render plugin.
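For reference, the plugin's call can be reproduced by hand with a plain HTTP POST to the same endpoint. The payload below is an illustrative guess using documented Automatic1111 API field names, not the exact data AI Render sends:

```python
import json
import urllib.request

def img2img(payload, base_url="http://localhost:7860"):
    """POST a payload to the Automatic1111 img2img endpoint (the same
    endpoint AI Render uses) and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{base_url}/sdapi/v1/img2img",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example payload; field names follow the public Automatic1111 API,
# but the exact values AI Render sends may differ.
payload = {
    "init_images": ["<base64-encoded PNG>"],
    "prompt": "a photorealistic landscape",
    "denoising_strength": 0.4,
    "steps": 20,
    "sampler_name": "DPM++ 2M",
}
```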

Error log

Error Message in Blender AI-Render Plugin:

An error occurred in the Automatic1111 Stable Diffusion server. Check the server logs for more info. (see below)


Here is the error message shown on the console of Stable Diffusion Server:

*** API error: POST: http://127.0.0.1:7860/sdapi/v1/img2img {'error': 'TypeError', 'detail': '', 'body': '', 'errors': "'<' not supported between instances of 'int' and 'NoneType'"}
Traceback (most recent call last):
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/streams/memory.py", line 98, in receive
    return self.receive_nowait()
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/streams/memory.py", line 93, in receive_nowait
    raise WouldBlock
anyio.WouldBlock

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 78, in call_next
    message = await recv_stream.receive()
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/streams/memory.py", line 118, in receive
    raise EndOfStream
anyio.EndOfStream

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/bornd/AI/stable-diffusion-webui/modules/api/api.py", line 187, in exception_handling
    return await call_next(request)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 84, in call_next
    raise app_exc
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 70, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 108, in __call__
    response = await self.dispatch_func(request, call_next)
  File "/home/bornd/AI/stable-diffusion-webui/modules/api/api.py", line 151, in log_and_time
    res: Response = await call_next(req)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 84, in call_next
    raise app_exc
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 70, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 84, in __call__
    await self.app(scope, receive, send)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 24, in __call__
    await responder(scope, receive, send)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 44, in __call__
    await self.app(scope, receive, self.send_with_gzip)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/fastapi/routing.py", line 237, in app
    raw_response = await run_endpoint_function(
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/fastapi/routing.py", line 165, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/home/bornd/AI/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/home/bornd/AI/stable-diffusion-webui/modules/api/api.py", line 403, in img2imgapi
    self.default_script_arg_img2img = self.init_default_script_args(script_runner)
  File "/home/bornd/AI/stable-diffusion-webui/modules/api/api.py", line 300, in init_default_script_args
    if last_arg_index < script.args_to:
TypeError: '<' not supported between instances of 'int' and 'NoneType'
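The final frame is the informative one: `init_default_script_args` compares an `int` with `script.args_to`, which is still `None` for one of the installed scripts, presumably because an extension never set its argument range. A minimal sketch reproduces that failure mode:

```python
# Minimal reproduction of the failing comparison in init_default_script_args:
# if a script's args_to attribute is left as None, comparing it with an int
# raises exactly the TypeError seen in the log.
last_arg_index = 1
args_to = None  # stand-in for a script whose argument range was never set

try:
    if last_arg_index < args_to:
        pass
except TypeError as exc:
    print(exc)  # '<' not supported between instances of 'int' and 'NoneType'
```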

Environment

  • Operating system (Windows/Mac/Linux): Windows 10, Ubuntu 22.04
  • Blender version (upper right corner of splash screen): 3.6.2 (on both operating systems)
  • AI Render version (find in Preferences > Add-ons): 0.9.1 (on both operating systems)
  • Stable Diffusion 1.6.0 (on both operating systems)

Screenshots/video

Screenshot of Blender and AI-Render Plugin: https://ibb.co/yhQTmk3

Log of SD Server under Ubuntu 22.04:

################################################################
Create and activate python venv
################################################################

################################################################
Launching launch.py...
################################################################
Using TCMalloc: libtcmalloc_minimal.so.4
Python 3.10.12 (main, Jun 11 2023, 05:26:28) [GCC 11.4.0]
Version: v1.6.0
Commit hash: 5ef669de080814067961f28357256e8fe27544f4
Installing sd-webui-controlnet requirement: changing opencv-python version from 4.7.0.72 to 4.8.0
Checking roop requirements
Install insightface==0.7.3
Installing sd-webui-roop requirement: insightface==0.7.3
Install onnx==1.14.0
Installing sd-webui-roop requirement: onnx==1.14.0
Install onnxruntime==1.15.0
Installing sd-webui-roop requirement: onnxruntime==1.15.0
Install opencv-python==4.7.0.72
Installing sd-webui-roop requirement: opencv-python==4.7.0.72
Installing SD-CN-Animation requirement: scikit-image
Launching Web UI with arguments: --xformers --port 7860 --api
Civitai Helper: Get Custom Model Folder
[-] ADetailer initialized. version: 23.9.2, num models: 9
2023-09-13 19:14:10,975 - ControlNet - INFO - ControlNet v1.1.410
ControlNet preprocessor location: /home/bornd/AI/stable-diffusion-webui/extensions/sd-webui-controlnet/annotator/downloads
2023-09-13 19:14:11,057 - ControlNet - INFO - ControlNet v1.1.410
2023-09-13 19:14:11,227 - roop - INFO - roop v0.0.2
2023-09-13 19:14:11,262 - roop - INFO - roop v0.0.2
Loading weights [c4b501713f] from /home/bornd/AI/stable-diffusion-webui/models/Stable-diffusion/SDXL/Realistic/juggernautXL_version3.safetensors
[VRAMEstimator] No stats available, run benchmark first
Deforum ControlNet support: enabled
Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().
Startup time: 21.4s (prepare environment: 13.8s, import torch: 1.6s, import gradio: 0.5s, setup paths: 0.7s, initialize shared: 0.1s, other imports: 0.3s, load scripts: 2.5s, create ui: 1.2s, gradio launch: 0.4s).
Creating model from config: /home/bornd/AI/stable-diffusion-webui/repositories/generative-models/configs/inference/sd_xl_base.yaml
Loading VAE weights specified in settings: /home/bornd/AI/stable-diffusion-webui/models/VAE/sdxl_vae.safetensors
Applying attention optimization: xformers... done.
Model loaded in 6.6s (load weights from disk: 1.9s, create model: 2.9s, apply weights to model: 1.6s).

*** API error: POST: http://127.0.0.1:7860/sdapi/v1/img2img {'error': 'TypeError', 'detail': '', 'body': '', 'errors': "'<' not supported between instances of 'int' and 'NoneType'"}
[traceback identical to the one quoted above, ending in]
TypeError: '<' not supported between instances of 'int' and 'NoneType'

Additional information

No response

@benrugg
Owner

benrugg commented Sep 13, 2023

Thanks for posting such a detailed error with the explanation, Windows comparison and the log! Unfortunately this is an error happening in Automatic1111, so I won't be much help. I would suggest posting it on their github.

@benrugg benrugg closed this as completed Sep 13, 2023
@Skinless-Glue-Fridge
Author

Skinless-Glue-Fridge commented Sep 13, 2023

I don't know what exact API POST AI Render is sending to Stable Diffusion, so what should I tell them?

@benrugg
Owner

benrugg commented Sep 13, 2023

I would just post that same info and those error logs. Hopefully someone will be able to see what the error is just from the logs.

@Skinless-Glue-Fridge
Author

Is there a way to record or log the call from AI Render to SD?

@Skinless-Glue-Fridge
Author

I just opened a ticket on the Automatic1111 GitHub:
AUTOMATIC1111/stable-diffusion-webui#13235

@benrugg
Owner

benrugg commented Sep 13, 2023

Great - I hope they help quickly! Here's what you could do to log what AI Render is sending:

  1. Edit the file sd_backends/automatic1111_api.py in AI Render. (You can find AI Render's location on the Blender Add-on Preferences screen).
  2. Find the line def do_post(url, data): and add print(data) as the next line (indenting it like the other lines)
  3. Launch Blender from the command line so you can see the output. Probably something like this in a terminal window: cd /Applications/Blender.app/Contents/MacOS && ./Blender. (See this for help, if you need it).
  4. Render with AI Render
  5. View the logged output in the terminal window
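Step 2 amounts to a one-line change. Sketched below, with the rest of do_post's body elided since it stays unchanged:

```python
# sd_backends/automatic1111_api.py (AI Render) -- sketch of the logging change
def do_post(url, data):
    print(data)  # added line: log the exact payload before it is sent
    # ... the original request-sending code continues unchanged ...
```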

@Skinless-Glue-Fridge
Author

Thanks. I have attached the output of the print(data) log from your AI Render engine.

Attached is the call:

output.txt

@benrugg
Owner

benrugg commented Sep 14, 2023

Cool, that should help the Automatic1111 people debug it. One potential thought I had just now is that you could try setting a different sampler in the AI Render interface. It's possible that DPM++ 2M is not behaving the same way on Linux. (It's a long shot, but it's something you could try)

@Skinless-Glue-Fridge
Author

Thanks for the idea, but identical result ;-)

@benrugg
Owner

benrugg commented Sep 14, 2023

Gotcha
