
Add option to flush queue instead of waiting for completion. #1864

Merged
2 commits merged into tracel-ai:main on Jun 13, 2024

Conversation

ArthurBrussee (Contributor)

Checklist

  • Confirmed that the run-checks all script has been executed.

Related Issues/PRs

This adds the last bit of infrastructure needed for custom burn<->wgpu interop, after initializing from an existing wgpu server (#1788), and getting WGPU resources (#1861).

Changes

After this PR, the setup for custom interop roughly looks like:

init_existing_device(adapter, device, wgpu_queue);

// Do some burn operations.
let x = Tensor::from_floats();
let y = x * 2 + 5.0; // Some calculations we really need in our wgpu code.

// Or however you have access to the client.
let client = y.into_primitive().client;

// Flush burn operations - they now live on the wgpu queue.
client.submit();

// Get a reference to y binding.
let y_wgpu_binding = client.get_resource(y);

// Run custom commands on the queue. wgpu makes guarantees that any resources have been
// calculated up to this point.
run_my_custom_wgpu_code(wgpu_queue, y_wgpu_binding);

This does require a separate encoder for your custom wgpu code. I think that's ok - a custom wgpu app might have its own already anyway - but in the future we could expose some notion of the server's encoder to share resources.
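
For illustration, here is a minimal sketch of what the custom side with its own encoder could look like. run_my_custom_wgpu_code is the hypothetical helper from the snippet above, and the pipeline, bind group, and workgroup count are placeholders; the signature also differs slightly from the call above because creating an encoder needs the wgpu::Device, and the resource binding would typically already be baked into a bind group. Only the separate-encoder, submit-after-flush flow is the point.

fn run_my_custom_wgpu_code(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    pipeline: &wgpu::ComputePipeline,
    bind_group: &wgpu::BindGroup,
) {
    // A separate encoder, independent of the one burn's server records into.
    let mut encoder = device.create_command_encoder(&wgpu::CommandEncoderDescriptor {
        label: Some("custom interop pass"),
    });
    {
        // Descriptor fields vary a bit between wgpu versions; Default keeps this version-agnostic.
        let mut pass = encoder.begin_compute_pass(&wgpu::ComputePassDescriptor::default());
        pass.set_pipeline(pipeline);
        pass.set_bind_group(0, bind_group, &[]);
        pass.dispatch_workgroups(64, 1, 1); // placeholder workgroup count
    }
    // Because client.submit() already flushed burn's work onto the same queue,
    // wgpu's submission ordering guarantees the shared resources are produced
    // before this command buffer runs.
    queue.submit(Some(encoder.finish()));
}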

Comment on lines 29 to 30
/// Submit the current queued tasks to the server.
fn submit(&self);
Member

Do we want to have a submit function here or pass an option to the sync method? I'm unsure which one is the most intuitive. It's a bit weird to have an exposed submit function alongside execute and sync.

fn sync(&self, options: SyncOptions);

#[derive(Clone, Copy)]
enum SyncOptions {
    /// Sync the execution queue.
    Queue,
    /// Sync the execution queue and wait for all tasks to be completed.
    Full,
}

ArthurBrussee (Contributor, Author) commented on Jun 10, 2024

Ah alright! Yeah that might be more consistent.

It does slightly complicate things, as for better or worse it kind of forces this to be exposed on the backend trait.

It's also not quite clear where to keep something like SyncOptions as it's used in a bunch of crates that can't really depend on each other.

Lastly, I changed the names a bit ("type" instead of "options", as there might be other options, and flush & wait instead of queue & full - the former being the more common terminology in the graphics world anyway).

Lmk how this looks!
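
For context, a rough sketch of the renamed shape described above; the exact names and where the type ends up living are approximations of this discussion, not a quotation of the merged code.

/// How the work queued on the compute client should be synchronized.
#[derive(Clone, Copy, Debug)]
pub enum SyncType {
    /// Submit the queued tasks to the device queue and return immediately.
    Flush,
    /// Submit the queued tasks and block until they have completed.
    Wait,
}

// A single sync entry point then replaces a separate submit() on the client:
//
//     client.sync(SyncType::Flush); // hand the work to the wgpu queue
//     client.sync(SyncType::Wait);  // full synchronization, as sync() behaved before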

@ArthurBrussee ArthurBrussee changed the title Add submit to the client API Add an option to only flush the queue instead of waiting for it to be done Jun 10, 2024
@ArthurBrussee ArthurBrussee changed the title Add an option to only flush the queue instead of waiting for it to be done Add option to flush queue instead of waiting for completion. Jun 10, 2024

codecov bot commented Jun 10, 2024

Codecov Report

Attention: Patch coverage is 42.30769% with 45 lines in your changes missing coverage. Please review.

Project coverage is 86.10%. Comparing base (4b174a8) to head (85c6cb1).
Report is 4 commits behind head on main.

Files                                           Patch %   Lines
crates/burn-candle/src/backend.rs                 0.00%   15 Missing ⚠️
crates/burn-tch/src/backend.rs                    0.00%   14 Missing ⚠️
crates/burn-compute/src/channel/mpsc.rs           0.00%    7 Missing ⚠️
crates/burn-autodiff/src/backend.rs               0.00%    2 Missing ⚠️
crates/burn-compute/src/channel/cell.rs           0.00%    2 Missing ⚠️
crates/burn-fusion/src/backend.rs                 0.00%    2 Missing ⚠️
crates/burn-jit/src/backend.rs                    0.00%    2 Missing ⚠️
crates/burn-tensor/src/tensor/backend/base.rs     0.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1864      +/-   ##
==========================================
- Coverage   86.24%   86.10%   -0.14%     
==========================================
  Files         774      776       +2     
  Lines       90159    90410     +251     
==========================================
+ Hits        77759    77851      +92     
- Misses      12400    12559     +159     


@nathanielsimard (Member) left a comment

Awesome! 👏

@nathanielsimard nathanielsimard merged commit c873d87 into tracel-ai:main Jun 13, 2024
14 checks passed
@ArthurBrussee ArthurBrussee deleted the submit_client branch July 2, 2024 16:50