
add prompt emphasis and length check #27

Closed
ssube opened this issue Jan 8, 2023 · 3 comments · Fixed by #93
Labels: status/fixed (issues that have been fixed and released), type/feature (new features)

ssube commented Jan 8, 2023

Implement ((emphasis)) and [deemphasis] following the same syntax used by the diffusers community pipeline (https://github.com/huggingface/diffusers/blob/main/examples/community/lpw_stable_diffusion_onnx.py#L78) and documented in the AUTOMATIC1111 wiki (https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Features#attentionemphasis):

  • (increase:N)
    • (increase:1.1)
    • ((increase:1.21))
  • [decrease:N]
    • [decrease:1.1]
    • [decrease:1.21]

Optionally support the {} syntax and 1.05 base weight as well.
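A rough sketch of what parsing this syntax could look like (not the lpw_stable_diffusion_onnx implementation; the function name, the stack-based nesting, and the explicit-weight semantics are all illustrative assumptions):

```python
import re

BOOST = 1.1  # per-bracket multiplier; the optional {} syntax would use 1.05


def parse_weights(prompt):
    """Split a prompt into (text, weight) pairs, where each (...) multiplies
    the enclosed text's weight by BOOST, each [...] divides it, and an
    explicit (text:N) / [text:N] applies N directly. Illustrative sketch only,
    not the diffusers tokenizer."""
    out = []
    weight = 1.0
    stack = []
    buf = ""

    def flush():
        nonlocal buf
        if buf:
            out.append((buf, round(weight, 4)))
            buf = ""

    for ch in prompt:
        if ch in "([":
            # entering a bracket: save the outer weight, apply the boost
            flush()
            stack.append(weight)
            weight *= BOOST if ch == "(" else 1 / BOOST
        elif ch in ")]":
            # explicit weight just before the close, e.g. (word:1.3)
            m = re.search(r":([0-9.]+)$", buf)
            if m and stack:
                explicit = float(m.group(1))
                base = stack[-1]  # weight outside this bracket
                w = base * explicit if ch == ")" else base / explicit
                out.append((buf[: m.start()], round(w, 4)))
                buf = ""
            else:
                flush()
            if stack:
                weight = stack.pop()
        else:
            buf += ch
    flush()
    return out
```

Under these assumptions, `parse_weights("plain (boost)")` yields `[("plain ", 1.0), ("boost", 1.1)]` and `parse_weights("((x))")` yields `[("x", 1.21)]`.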

@ssube ssube added status/new issues that have not been confirmed yet type/feature new features labels Jan 8, 2023
@ssube ssube modified the milestones: v0.3, v0.4 Jan 8, 2023
ssube commented Jan 12, 2023

https://github.com/ssube/onnx-web/tree/feat/27-lpw enables the long prompt weighting pipeline, but it doesn't work yet, failing with one of two errors depending on whether a negative prompt was supplied:

[2023-01-11 22:07:04,391] ERROR in app: Exception on /txt2img [POST]                                                                                                                            
Traceback (most recent call last):                                                                                                                                                              
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 2525, in wsgi_app                                                                        
    response = self.full_dispatch_request()                                                                                                                                                     
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1822, in full_dispatch_request                                                           
    rv = self.handle_user_exception(e)                                                                                                                                                          
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1820, in full_dispatch_request                                                           
    rv = self.dispatch_request()                                                                                                                                                                
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1796, in dispatch_request                                                                
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)                                                                                                                    
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_web\serve.py", line 331, in txt2img                                                                                                
    image = pipe.text2img(                                                                                                                                                                      
TypeError: OnnxStableDiffusionLongPromptWeightingPipeline.text2img() got multiple values for argument 'negative_prompt'

or

[2023-01-11 22:07:32,838] ERROR in app: Exception on /txt2img [POST]                             
Traceback (most recent call last):                                                               
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()                                                      
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)                                                           
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1820, in full_dispatch_request
    rv = self.dispatch_request()                                                                 
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\app.py", line 1796, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)                     
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_web\serve.py", line 331, in txt2img 
    image = pipe.text2img(                                                                       
  File "C:\Users\ssube/.cache\huggingface\modules\diffusers_modules\git\lpw_stable_diffusion_onnx.py", line 942, in text2img
    return self.__call__(                                                                        
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)                                                                 
  File "C:\Users\ssube/.cache\huggingface\modules\diffusers_modules\git\lpw_stable_diffusion_onnx.py", line 766, in __call__
    text_embeddings = self._encode_prompt(                                                       
  File "C:\Users\ssube/.cache\huggingface\modules\diffusers_modules\git\lpw_stable_diffusion_onnx.py", line 519, in _encode_prompt
    if batch_size != len(negative_prompt):                                                       
TypeError: object of type 'int' has no len()

It seems like something is passing an integer as the negative prompt, but I'm not sure what or where.
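Both tracebacks would be consistent with positional arguments shifting into the `negative_prompt` slot. A minimal stand-in (the signature below is hypothetical, for illustration only, not the real pipeline API) reproduces both TypeErrors:

```python
# Hypothetical stand-in for the LPW pipeline entry point.
def text2img(prompt, negative_prompt=None, height=512, width=512):
    if negative_prompt is not None:
        # the real pipeline does: if batch_size != len(negative_prompt): ...
        len(negative_prompt)
    return "ok"


# Passing height/width positionally shifts 512 into negative_prompt:
try:
    text2img("a photo", 512, 512)
except TypeError as err:
    print(err)  # object of type 'int' has no len()

# Supplying negative_prompt both positionally and as a keyword:
try:
    text2img("a photo", 512, negative_prompt="blurry")
except TypeError as err:
    print(err)  # got multiple values for argument 'negative_prompt'
```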

@ssube ssube added status/blocked in-progress issues that are blocked by a dependency and removed status/new issues that have not been confirmed yet labels Jan 12, 2023
@ssube ssube modified the milestones: v0.4, v0.5, v0.6 Jan 12, 2023
ssube commented Jan 15, 2023

The `object of type 'int' has no len()` error was due to height/width being kwargs on the LPW pipeline. I have it working with some schedulers, but it's failing with others. For example, DDIM works, but Euler A fails with:

reusing existing pipeline
changing pipeline scheduler
  0%|          | 0/50 [00:00<?, ?it/s]
10.2.2.16 - - [14/Jan/2023 22:40:54] "GET /api/ready?output=txt2img_1339950357_eec1cb2170e97d2ab97050cce51482cbb86801febee00c25185298713475ad60.png HTTP/1.1" 200 -
  0%|          | 0/50 [00:00<?, ?it/s]
exception calling callback for <Future at 0x299d0000a60 state=finished raised TypeError>
Traceback (most recent call last):
  File "C:\Users\ssube\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\_base.py", line 342, in _invoke_callbacks
    callback(self)
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask_executor\executor.py", line 37, in propagate_exceptions_callback
    raise exc
  File "C:\Users\ssube\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask_executor\executor.py", line 29, in wrapper
    return fn(*args, **kwargs)
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\flask\ctx.py", line 182, in wrapper
    return ctx.app.ensure_sync(f)(*args, **kwargs)
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_web\serve.py", line 236, in run_txt2img_pipeline
    image = pipe.text2img(
  File "C:\Users\ssube/.cache\huggingface\modules\diffusers_modules\local\community-lpw.py", line 943, in text2img
    return self.__call__(
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\ssube/.cache\huggingface\modules\diffusers_modules\local\community-lpw.py", line 834, in __call__
    scheduler_output = self.scheduler.step(
  File "C:\Users\ssube\stabdiff\onnx-try-2\onnx-web\api\onnx_env\lib\site-packages\diffusers\schedulers\scheduling_euler_ancestral_discrete.py", line 239, in step
    noise = torch.randn(model_output.shape, dtype=model_output.dtype, device=device, generator=generator).to(
TypeError: randn() received an invalid combination of arguments - got (torch.Size, device=torch.device, dtype=torch.dtype, generator=numpy.random.mtrand.RandomState), but expected one of:     
 * (tuple of ints size, *, torch.Generator generator, tuple of names names, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 * (tuple of ints size, *, torch.Generator generator, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 * (tuple of ints size, *, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 * (tuple of ints size, *, tuple of names names, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
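The failure looks like the ONNX pipeline's `numpy.random.RandomState` being forwarded into `torch.randn`, which only accepts a `torch.Generator`. A torch-free sketch of the mismatch and one possible shim (all class and function names below are stand-ins, not real torch or numpy types):

```python
import random


class RandomStateStandIn:
    """Stand-in for numpy.random.RandomState, which the ONNX pipelines
    thread through as `generator`."""

    def __init__(self, seed):
        self.seed = seed


class TorchGeneratorStandIn:
    """Stand-in for torch.Generator, the only generator type torch.randn
    accepts."""

    def __init__(self, seed):
        self.rng = random.Random(seed)


def randn_standin(shape, generator):
    # Mimic torch.randn's strict dispatch: any other generator type raises
    # a TypeError like the one in the Euler A traceback above.
    if not isinstance(generator, TorchGeneratorStandIn):
        raise TypeError("randn() received an invalid combination of arguments")
    count = 1
    for dim in shape:
        count *= dim
    return [generator.rng.gauss(0.0, 1.0) for _ in range(count)]


def as_torch_generator(generator):
    # Hypothetical shim: rebuild a torch-style generator from the same seed
    # before handing it to a torch-based scheduler, instead of passing the
    # NumPy state straight through.
    if isinstance(generator, RandomStateStandIn):
        return TorchGeneratorStandIn(generator.seed)
    return generator
```

If that diagnosis holds, re-seeding a per-framework generator at the scheduler boundary would sidestep the dispatch error without changing the pipeline's seed handling.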

Working:

  • DDIM
  • DPM Multistep
  • DPM Singlestep
  • Heun
  • KDPM2
  • LMS
  • PNDM

Not working:

  • DDPM
  • Euler
  • Euler A
  • KDPM2 A

Karras Ve is failing with a different error, which is probably unrelated:

TypeError: KarrasVeScheduler.step() missing 1 required positional argument: 'sample_hat'

@ssube ssube added status/progress issues that are in progress and have a branch and removed status/blocked in-progress issues that are blocked by a dependency labels Jan 15, 2023
@ssube ssube mentioned this issue Feb 5, 2023
@ssube ssube added status/fixed issues that have been fixed and released and removed status/progress issues that are in progress and have a branch labels Feb 5, 2023
@ssube ssube closed this as completed in #93 Feb 5, 2023
@ssube ssube reopened this Feb 5, 2023
ssube commented Feb 5, 2023

This was merged, but I forgot that it broke some schedulers, so that still needs to be fixed.
