
Bevy chooses CPU render device instead of integrated GPU #13113

Closed
ByteNybbler opened this issue Apr 27, 2024 · 7 comments
Labels
A-Rendering (Drawing game state to the screen) · C-Bug (An unexpected or incorrect behavior) · C-Regression (Functionality that used to work but no longer does. Add a test for this!) · P-High (This is particularly urgent, and deserves immediate attention)

Comments


ByteNybbler commented Apr 27, 2024

Bevy version

v0.13.1 (issue is still present in v0.13.2)

[Optional] Relevant system information

  • My actual GPU: AdapterInfo { name: "Intel(R) HD Graphics 4600", vendor: 32902, device: 0, device_type: IntegratedGpu, driver: "OpenGL", driver_info: "4.3.0 - Build 20.19.15.4531", backend: Gl }
  • CPU rendering that gets used instead: AdapterInfo { name: "Microsoft Basic Render Driver", vendor: 5140, device: 140, device_type: Cpu, driver: "", driver_info: "", backend: Dx12 }
  • SystemInfo { os: "Windows 10 Home", kernel: "19045", cpu: "Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz", core_count: "4", memory: "15.9 GiB" }
  • cargo 1.77.0 (3fe68eabf 2024-02-29)

What you did

cargo run --example 2d_shapes

What went wrong

Bevy 0.13 appears to be using CPU rendering instead of using my integrated GPU.

I don't think this is a problem with my GPU drivers, since Bevy versions prior to 0.13 select my GPU correctly. It's probably not a regression in Bevy itself either: the Dx12 backend on my GPU was filtered out by an upstream wgpu change (see below). However, my GPU's Gl backend is still available for use.

CPU rendering tends to be extremely slow and buggy and should only be used as a last resort. If any GPU is available, I believe it should be chosen instead.

Bevy currently uses wgpu 0.19.3. Running wgpu 0.19.3's hello example shows the following adapters and properly selects the GPU adapter:

[2024-04-27T02:07:10Z INFO  wgpu_examples::hello] Available adapters:
[2024-04-27T02:07:10Z INFO  wgpu_examples::hello]     AdapterInfo { name: "Microsoft Basic Render Driver", vendor: 5140, device: 140, device_type: Cpu, driver: "", driver_info: "", backend: Dx12 }
[2024-04-27T02:07:10Z INFO  wgpu_examples::hello]     AdapterInfo { name: "Intel(R) HD Graphics 4600", vendor: 32902, device: 0, device_type: IntegratedGpu, driver: "", driver_info: "", backend: Gl }
[2024-04-27T02:07:10Z INFO  wgpu_examples::hello] Selected adapter: AdapterInfo { name: "Intel(R) HD Graphics 4600", vendor: 32902, device: 0, device_type: IntegratedGpu, driver: "", driver_info: "", backend: Gl }

However, running Bevy 0.13 apps chooses the CPU adapter instead:

AdapterInfo { name: "Microsoft Basic Render Driver", vendor: 5140, device: 140, device_type: Cpu, driver: "", driver_info: "", backend: Dx12 }

It's possible that Bevy requests render features that my GPU's Gl backend does not support. If that's the case, I wonder whether Bevy's render feature requirements should be loosened a bit, or whether some other measure should be employed so that CPU rendering is not picked while other options are available (see the sketch below).
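For illustration only (this is not what Bevy currently does), here is one shape such a measure could take: enumerate all adapters and treat CPU adapters as a last resort. A minimal sketch against wgpu 0.19's native-only enumerate_adapters; pick_adapter is a hypothetical helper, not an existing API:

    /// Hypothetical helper: prefer any non-CPU adapter, falling back to a
    /// CPU adapter only when nothing else is available.
    /// (enumerate_adapters is native-only; this sketch ignores wasm.)
    fn pick_adapter(instance: &wgpu::Instance) -> Option<wgpu::Adapter> {
        let mut adapters: Vec<wgpu::Adapter> = instance
            .enumerate_adapters(wgpu::Backends::all())
            .into_iter()
            .collect();
        // First choice: anything that is not a software/CPU adapter.
        if let Some(i) = adapters
            .iter()
            .position(|a| a.get_info().device_type != wgpu::DeviceType::Cpu)
        {
            return Some(adapters.swap_remove(i));
        }
        // Only CPU adapters (or no adapters at all) were found.
        adapters.pop()
    }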

Running wgpu 0.19.3's hello_triangle example causes the CPU adapter to be chosen even though its device request is only the following:

    let (device, queue) = adapter
        .request_device(
            &wgpu::DeviceDescriptor {
                label: None,
                required_features: wgpu::Features::empty(),
                // Make sure we use the texture resolution limits from the adapter, so we can support images the size of the swapchain.
                required_limits: wgpu::Limits::downlevel_webgl2_defaults()
                    .using_resolution(adapter.limits()),
            },
            None,
        )
        .await
        .expect("Failed to create device");

I'm unsure if this counts as an upstream issue because I don't know whether wgpu itself is supposed to be biased towards choosing GPUs over CPUs by default.

EDIT: On the latest commits of wgpu's main branch (trunk), wgpu's hello_triangle example now properly chooses my GPU due to a recent fix. See the comments below for more details.

Additional information

The issue started occurring for me after this PR: #11280 (Update to wgpu 0.19 and raw-window-handle 0.6)

Upstream, wgpu seems to have filtered out the Dx12 backend on certain GPUs due to a security vulnerability: gfx-rs/wgpu#5615

Prior to the upstream wgpu change, Bevy chose my GPU's Dx12 backend and worked fine. However, once Bevy's wgpu dependency was updated to 0.19.1 in the above PR, Bevy could no longer use my GPU's Dx12 backend. This makes Bevy 0.13 projects effectively unplayable on my PC, whereas everything worked before the wgpu 0.19.1 update.
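As a possible stopgap until adapter selection is fixed (untested on my machine, and note that it only changes which backends wgpu may consider, not how adapters are ranked), a Bevy 0.13 app can restrict wgpu to the GL backend through WgpuSettings:

    use bevy::prelude::*;
    use bevy::render::{
        settings::{Backends, RenderCreation, WgpuSettings},
        RenderPlugin,
    };

    fn main() {
        App::new()
            .add_plugins(DefaultPlugins.set(RenderPlugin {
                // Restrict wgpu to the GL backend so the Dx12 CPU adapter
                // is never a candidate.
                render_creation: RenderCreation::Automatic(WgpuSettings {
                    backends: Some(Backends::GL),
                    ..default()
                }),
                ..default()
            }))
            .run();
    }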

@ByteNybbler ByteNybbler added C-Bug An unexpected or incorrect behavior S-Needs-Triage This issue needs to be labelled labels Apr 27, 2024
@JoJoJet JoJoJet added A-Rendering Drawing game state to the screen P-High This is particularly urgent, and deserves immediate attention C-Regression Functionality that used to work but no longer does. Add a test for this! and removed S-Needs-Triage This issue needs to be labelled labels Apr 27, 2024
@cwfitzgerald

Weirdly, it should fall back to GL. I wonder if this is gfx-rs/wgpu#5535 making the GL adapter incompatible with the surface.

@ByteNybbler (Author)

> Weirdly, it should fall back to GL. I wonder if this is gfx-rs/wgpu#5535 making the GL adapter incompatible with the surface.

Yes, that seems to be the case. The PR you linked appears to fix the issue on my computer, at least when running wgpu's hello_triangle example: before that PR, the CPU adapter is incorrectly chosen, and after it, my integrated GPU is correctly chosen.

Once the contents of that PR are available in a new wgpu release, Bevy should update its wgpu dependency to that release, and I think that should fix this issue.
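(For anyone who wants to verify the same thing on their own hardware: wgpu exposes Adapter::is_surface_supported, which reports whether an adapter can present to a given surface. A rough sketch, assuming an instance and surface already exist:)

    /// Log which adapters report compatibility with the window's surface.
    fn log_surface_support(instance: &wgpu::Instance, surface: &wgpu::Surface<'_>) {
        for adapter in instance.enumerate_adapters(wgpu::Backends::all()) {
            let info = adapter.get_info();
            println!(
                "{:?} {:?} {}: surface supported = {}",
                info.backend,
                info.device_type,
                info.name,
                adapter.is_surface_supported(surface),
            );
        }
    }

If the diagnosis above is right, the Gl adapter should report false before that PR and true after it.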

@ByteNybbler (Author)

wgpu v0.20.0 was just released and includes gfx-rs/wgpu#5535, so this issue should be fixed once Bevy updates its wgpu dependency to that version.

@mnmaita (Member)

mnmaita commented May 21, 2024

wgpu will be updated by #13186

@janhohenheim (Member)

Since WGPU 0.20 is in use now, it would be nice if you could check this on main again @ByteNybbler

@ByteNybbler (Author)

> Since WGPU 0.20 is in use now, it would be nice if you could check this on main again @ByteNybbler

I'll be back at my desktop in a bit over a week, so I'll test this again at that time.

@ByteNybbler (Author)

Testing the 2D shapes example on v0.14.0-rc.3 properly selects my integrated GPU adapter! Thank you so much to everyone involved in the fix!

On the other hand, when running the 2D shapes example, I'm now getting the same error spam as #13115, but I'm going to consider the current issue resolved.
