Support for ONNX subgraphs in Burn for conditional inference computation #724

Status: Open
antimora opened this issue Aug 29, 2023 · 3 comments
Labels: feature (The feature request), onnx

@antimora (Collaborator) commented:
Description

Introduce the capability to support ONNX subgraphs within the Burn framework, particularly for handling conditional inference computations. This feature aims to enable Burn to fully support modern complex ONNX models that rely on subgraphs for conditional computations.

Motivation

  1. Complete ONNX Compatibility: Many advanced models, such as version 4 of Silero VAD, use subgraphs for various tasks. Without ONNX subgraph support, compatibility between Burn and ONNX remains incomplete.

  2. Conditional Inference: Subgraphs are particularly useful for models that require conditional computation during inference, allowing for more dynamic and efficient operations.

  3. Model Integrity and Flexibility: Providing support for ONNX subgraphs ensures that Burn can accommodate more complex model architectures without requiring modifications to the original ONNX model.

  4. Elevate Burn's Capabilities: Incorporating subgraph support would make Burn a more versatile and robust framework, increasing its appeal to a broader audience.

Proposed Implementation Steps

Backend Steps

  1. Subgraph Detection: Parse the ONNX file to identify and isolate subgraphs.

  2. Subgraph-to-Module Conversion: Convert these subgraphs into Burn-compatible modules, maintaining their conditional inference capabilities.

  3. Integration and Testing: Integrate the converted modules into Burn and conduct thorough tests to ensure they work as expected.

Code Changes

  • Some refactoring of the existing implementation will be required.

  • However, this feature should be achievable without major architectural changes.

Additional Features

  • Documentation: Offer detailed documentation and examples that show how to work with ONNX subgraphs within Burn.

  • Error Handling and Warnings: Implement robust error handling to notify users of any potential issues during the import and conversion process.

  • Logging: Provide logging mechanisms for debugging and monitoring the performance of subgraph handling.
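On the error-handling point, a minimal sketch of the intent (the function and names below are hypothetical, not burn-import's API): an unsupported op type should surface as a descriptive `Err` naming the offending op, rather than an `unwrap()` panic deep inside the converter:

```rust
// Hypothetical helper: report unsupported ops with context instead of panicking.
fn check_op_supported(op_type: &str, supported: &[&str]) -> Result<(), String> {
    if supported.contains(&op_type) {
        Ok(())
    } else {
        Err(format!(
            "unsupported ONNX op `{op_type}` encountered during import; \
             ops supported here: {supported:?}"
        ))
    }
}

fn main() {
    let supported = ["Constant", "Equal"];
    assert!(check_op_supported("Equal", &supported).is_ok());
    let err = check_op_supported("If", &supported).unwrap_err();
    assert!(err.contains("unsupported ONNX op `If`"));
    println!("{err}");
}
```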

@hkrutzer commented:

Thanks for creating and open-sourcing Burn! I think there is a lot of demand for running Silero VAD inside Rust; several projects using Whisper want to add a VAD. Looking forward to this feature!

@willstott101 commented Jul 20, 2024:

There is a new, improved version of Silero VAD (v5): https://github.com/snakers4/silero-vad/blob/master/src/silero_vad/data/silero_vad.onnx. If the model architecture has changed in any meaningful way, this might adjust priorities in terms of feature support (I have yet to test Burn with v5).

@willstott101 commented:

Yeah, the `If` operator still appears to be a blocker:

error: failed to run custom build command for `onnx-inference v0.14.0 (/home/will/repos/others/burn/examples/onnx-inference)`

Caused by:
  process didn't exit successfully: `/home/will/repos/others/burn/target/debug/build/onnx-inference-a632abc0e6b57b56/build-script-build` (exit status: 101)
  --- stdout
   INFO burn_import::onnx::to_burn: Starting to convert ONNX to Burn    
  DEBUG burn_import::onnx::to_burn: Output directory: "/home/will/repos/others/burn/target/debug/build/onnx-inference-f7ec7fb62c71ecd4/out/model/"    
   INFO burn_import::onnx::to_burn: Converting "/home/will/Downloads/silero_vad.onnx"    
  DEBUG burn_import::onnx::to_burn: Input file name: "silero_vad"    
  DEBUG burn_import::onnx::to_burn: Output file: "/home/will/repos/others/burn/target/debug/build/onnx-inference-f7ec7fb62c71ecd4/out/model/silero_vad"    
   INFO burn_import::onnx::to_burn: Generating model from "/home/will/Downloads/silero_vad.onnx"    
  DEBUG burn_import::onnx::to_burn: Development mode: false    
  DEBUG burn_import::onnx::to_burn: Output file: "/home/will/repos/others/burn/target/debug/build/onnx-inference-f7ec7fb62c71ecd4/out/model/silero_vad"    
   INFO onnx_ir::from_onnx: Parsing ONNX file: /home/will/Downloads/silero_vad.onnx    
  DEBUG onnx_ir::from_onnx: Number of nodes: 5    
  DEBUG onnx_ir::from_onnx: Number of inputs: 3    
  DEBUG onnx_ir::from_onnx: Number of initializers: 0    
  DEBUG onnx_ir::from_onnx: Number of outputs: 2    
  DEBUG onnx_ir::proto_conversion: Converting ONNX node with type "Constant"    
  DEBUG onnx_ir::from_onnx: renaming node "Constant_0"    
  DEBUG onnx_ir::from_onnx: adding node "constant1"    
  DEBUG onnx_ir::proto_conversion: Converting ONNX node with type "Equal"    
  DEBUG onnx_ir::from_onnx: renaming node "Equal_0"    
  DEBUG onnx_ir::from_onnx: adding node "equal1"    
  DEBUG onnx_ir::proto_conversion: Converting ONNX node with type "If"    
  ERROR burn_import::logger: PANIC => panicked at crates/onnx-ir/src/proto_conversion.rs:178:73:
  called `Result::unwrap()` on an `Err` value: VariantNotFound    

  --- stderr
  thread 'main' panicked at crates/onnx-ir/src/proto_conversion.rs:178:73:
  called `Result::unwrap()` on an `Err` value: VariantNotFound
  note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
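For context on this panic: `VariantNotFound` is the error name used by strum's `EnumString` derive when a string matches no enum variant, which suggests the op-type string `"If"` simply has no corresponding variant in onnx-ir's node-type enum yet. A dependency-free sketch of that failure mode (the enum and error type below are illustrative, not the crate's real `NodeType`):

```rust
use std::str::FromStr;

// Simplified, hypothetical stand-in for onnx-ir's node-type enum:
// note there is no `If` variant.
#[derive(Debug, PartialEq)]
enum NodeType {
    Constant,
    Equal,
}

// Stand-in for the parse error (strum names this `ParseError::VariantNotFound`).
#[derive(Debug, PartialEq)]
struct VariantNotFound;

impl FromStr for NodeType {
    type Err = VariantNotFound;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "Constant" => Ok(NodeType::Constant),
            "Equal" => Ok(NodeType::Equal),
            // Any unknown op type, including "If", falls through here.
            _ => Err(VariantNotFound),
        }
    }
}

fn main() {
    // Known ops parse fine, matching the "Constant"/"Equal" lines in the log.
    assert_eq!("Equal".parse::<NodeType>(), Ok(NodeType::Equal));
    // "If" cannot be mapped, so calling .unwrap() on this Result panics,
    // which is the failure reported at proto_conversion.rs above.
    assert_eq!("If".parse::<NodeType>(), Err(VariantNotFound));
    println!("\"If\" -> VariantNotFound");
}
```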
