Blur Background setting causing significant GPU usage above certain settings #271

Closed
Sparronator9999 opened this issue Apr 23, 2023 · 10 comments · Fixed by #276
Labels: bug (Something isn't working)
Milestone: v0.5.18

Comments

@Sparronator9999
Contributor

Describe the bug

When the Blur Background Factor setting is set high enough, the background blur effect uses a very significant amount of GPU resources (this is more apparent on integrated graphics).

To Reproduce

Steps to reproduce the behavior:

  1. Open Task Manager and OBS Studio
  2. Add a Background Removal filter to any source (if you haven't already).
  3. Set Blur Background Factor to a high value (values as low as 6 max out my integrated graphics, while my discrete graphics can handle up to around 12 before lagging significantly)
  4. Observe the high GPU usage in Task Manager

Expected behavior

The background blur does not use this much GPU time.

Screenshots

Before (tested using commit 39adf39) vs. after (latest release)

Background blur factor was set to 50 (before) and 12 (after); from my testing, these settings give about the same amount of blur when switching versions.

Log and Crash Report

2023-04-24 07-53-54.txt

2023-04-24 07-55-51.txt

Desktop (please complete the following information):

  • OS: Windows 10 (22H2)
  • Browser: N/A
  • Plugin Version: Latest (v0.5.17)
  • OBS Version: Latest (v29.0.2)

Additional context

Commit 39adf39 and earlier do not seem to exhibit this behaviour, which suggests that the newly introduced GPU-based blur in the latest release is actually performing worse than the old CPU implementation.

@umireon
Member

umireon commented Apr 24, 2023

Originally the background blur was performed on the CPU, but now it is performed on the GPU, so increased GPU load is expected.
Blurring is a very heavy computation, though.

@Sparronator9999
Contributor Author

I realise that, but this blur in particular seems to be more resource-intensive (even though it's now done on the GPU) than the old CPU implementation was, for around the same amount of blur.

@umireon
Member

umireon commented Apr 24, 2023

The current blur implementation is a naïve one, and we should implement a more efficient algorithm, such as one of those in the following article:

https://www.intel.com/content/www/us/en/developer/articles/technical/an-investigation-of-fast-real-time-gpu-based-image-blur-algorithms.html
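
For context, a "naïve" blur here means sampling every texel in a (2r+1) x (2r+1) window for each output pixel, so the per-pixel cost grows quadratically with the blur radius. A minimal CPU sketch of that access pattern (illustrative only, not the plugin's actual shader), assuming a single-channel float buffer:

```cpp
// Illustrative only: a naive single-pass box blur on a grayscale buffer.
// Every output pixel reads a full (2r+1) x (2r+1) neighbourhood, so the
// work per pixel grows quadratically with the blur radius r.
#include <algorithm>
#include <vector>

std::vector<float> naiveBoxBlur(const std::vector<float> &src,
                                int width, int height, int r)
{
    std::vector<float> dst(src.size());
    const int taps = (2 * r + 1) * (2 * r + 1); // quadratic in r
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float sum = 0.0f;
            for (int dy = -r; dy <= r; ++dy) {
                for (int dx = -r; dx <= r; ++dx) {
                    // Clamp to the image edges so every tap stays in bounds.
                    int sx = std::clamp(x + dx, 0, width - 1);
                    int sy = std::clamp(y + dy, 0, height - 1);
                    sum += src[sy * width + sx];
                }
            }
            dst[y * width + x] = sum / taps;
        }
    }
    return dst;
}
```

If the blur factor maps roughly to the kernel radius, a factor of 12 already means several hundred samples per pixel per frame, which would be consistent with the GPU load described above.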

@royshil
Collaborator

royshil commented Apr 24, 2023

we could add an option to choose GPU or CPU blur... to fit everyone's needs
for some (many) - saving CPU cycles is more important, for others it's the GPU

@royshil
Collaborator

royshil commented Apr 24, 2023

The current blur implementation is a naïve one, and we should implement a more efficient algorithm, such as one of those in the following article:

https://www.intel.com/content/www/us/en/developer/articles/technical/an-investigation-of-fast-real-time-gpu-based-image-blur-algorithms.html

i've read this article in the past while researching efficient blur

it all comes down to kernel separability. with a "two pass" (instead of the current "one pass") we can do box or even gaussian blur pretty fast...
i can implement it, but i think the 2 rendering passes may be slower than what we have now
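
For illustration of the separable approach: a box (or Gaussian) kernel factors into a horizontal and a vertical 1D pass, so the per-pixel cost drops from (2r+1)^2 taps to 2 * (2r+1). A hedged CPU sketch of the idea, not the plugin's implementation:

```cpp
// Illustrative only: the same box blur split into two 1D passes.
// For a separable kernel, a 2D blur equals a horizontal pass followed by
// a vertical pass, with 2 * (2r+1) taps per pixel instead of (2r+1)^2.
#include <algorithm>
#include <vector>

static std::vector<float> blur1D(const std::vector<float> &src,
                                 int width, int height, int r, bool horizontal)
{
    std::vector<float> dst(src.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float sum = 0.0f;
            for (int d = -r; d <= r; ++d) {
                // Step along one axis only, clamping at the image edges.
                int sx = horizontal ? std::clamp(x + d, 0, width - 1) : x;
                int sy = horizontal ? y : std::clamp(y + d, 0, height - 1);
                sum += src[sy * width + sx];
            }
            dst[y * width + x] = sum / (2 * r + 1);
        }
    }
    return dst;
}

std::vector<float> separableBoxBlur(const std::vector<float> &src,
                                    int width, int height, int r)
{
    // Horizontal pass, then vertical pass over the intermediate result.
    return blur1D(blur1D(src, width, height, r, true),
                  width, height, r, false);
}
```

On the GPU the second pass does cost an extra render target and pass setup, which is the overhead mentioned above, but for large radii the per-pixel saving typically dominates.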

@gretel

gretel commented Apr 24, 2023

it all comes down to kernel separability. with a "two pass" (instead of the current "one pass") we can do box or even gaussian blur pretty fast... i can implement it, but i think the 2 rendering passes may be slower than what we have now

thanks, curious to try.

@umireon
Member

umireon commented Apr 24, 2023

@royshil I have implemented a GPU Kawase blur in #276. It is blazingly fast even when the blur size is very large.
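
For context, a Kawase-style blur replaces one wide kernel with a few cheap passes: each pass averages four diagonally offset samples, and the offset grows with the pass index, so a handful of 4-tap passes approximates a very wide Gaussian. A rough CPU sketch of the idea (the actual shader in #276 may differ; `sampleBilinear` is a hypothetical helper for this sketch):

```cpp
// Illustrative only: Kawase-style blur on a grayscale buffer.  Each pass
// takes four diagonal samples at a growing offset; bilinear filtering at
// fractional offsets means each tap already averages a 2x2 block.
#include <algorithm>
#include <utility>
#include <vector>

// Bilinear sample with clamped edges (hypothetical helper for this sketch).
static float sampleBilinear(const std::vector<float> &img, int width, int height,
                            float x, float y)
{
    x = std::clamp(x, 0.0f, width - 1.0f);
    y = std::clamp(y, 0.0f, height - 1.0f);
    int x0 = (int)x, y0 = (int)y;
    int x1 = std::min(x0 + 1, width - 1), y1 = std::min(y0 + 1, height - 1);
    float fx = x - x0, fy = y - y0;
    float top = img[y0 * width + x0] * (1.0f - fx) + img[y0 * width + x1] * fx;
    float bot = img[y1 * width + x0] * (1.0f - fx) + img[y1 * width + x1] * fx;
    return top * (1.0f - fy) + bot * fy;
}

std::vector<float> kawaseBlur(std::vector<float> img, int width, int height, int passes)
{
    std::vector<float> tmp(img.size());
    for (int pass = 0; pass < passes; ++pass) {
        float o = pass + 0.5f; // sample offset grows every pass
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                float s = sampleBilinear(img, width, height, x - o, y - o) +
                          sampleBilinear(img, width, height, x + o, y - o) +
                          sampleBilinear(img, width, height, x - o, y + o) +
                          sampleBilinear(img, width, height, x + o, y + o);
                tmp[y * width + x] = s * 0.25f;
            }
        }
        std::swap(img, tmp); // ping-pong buffers between passes
    }
    return img;
}
```

The cost per pass is constant (four taps per pixel), so widening the blur only adds passes rather than growing a kernel, which is why it stays fast at large blur sizes.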

@Sparronator9999
Contributor Author

we could add an option to choose GPU or CPU blur... to fit everyone's needs for some (many) - saving CPU cycles is more important, for others it's the GPU

I 100% agree with this. For people with weak GPUs (i.e. integrated graphics), there should be an option to choose which device does the background blur (similar to the Inference Device setting).

@royshil I have implemented a GPU Kawase blur in #276. It is blazingly fast even when the blur size is very large.

@umireon, can you please provide Windows testing binaries?

@umireon
Member

umireon commented Apr 25, 2023

@Sparronator9999
Contributor Author

@umireon Works great (even on integrated graphics)!

umireon added this to the v0.5.18 milestone Apr 25, 2023