AMD GPU-NPU #25142

Open

Description

@yashgadodia12

Hi,

I want to set up a full system environment on my AMD Ryzen 9 7940HS (with Radeon 780M GPU and XDNA NPU) to run ONNX models using ONNX Runtime with multiple execution providers:

ROCMExecutionProvider (for the GPU)
VitisAIExecutionProvider (for the NPU)
CPUExecutionProvider

My plan is:

Install Ubuntu 22.04 (fresh install).
Install ROCm 6.4.1.
Install the Vitis AI runtime (Vitis AI 3.5).
Build ONNX Runtime from source with both the --use_rocm and --use_vitisai options (see the post-build check sketched below).
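
To sanity-check the last step, the freshly built wheel can be inspected with onnxruntime.get_available_providers(). This is a minimal sketch using the provider name strings ONNX Runtime registers; it assumes the build and the `pip install` of the resulting wheel succeeded:

```python
# Minimal post-build check: confirm the wheel built with --use_rocm and
# --use_vitisai actually registered the ROCm and Vitis AI providers.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available execution providers:", available)

for ep in ("ROCMExecutionProvider", "VitisAIExecutionProvider", "CPUExecutionProvider"):
    print(f"{ep}: {'present' if ep in available else 'MISSING'}")
```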

I want to make sure:
This is the correct way to enable both ROCm and Vitis AI execution providers together.
There are no compatibility issues between ROCm and Vitis AI.
I can run inference on both the GPU and the NPU through ONNX Runtime.
Is this the correct approach? Are there any known issues or recommended best practices for this setup?
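
For the inference question: a single InferenceSession accepts a prioritized provider list, and ONNX Runtime assigns each graph node to the first provider in that list that supports it, falling back to the CPU provider for the rest. Below is a minimal sketch, assuming both providers are present in the build; the model path "model.onnx" and the dummy float32 input are placeholder assumptions, not part of any specific setup.

```python
# Minimal sketch: run one model with an NPU-first provider priority list.
# "model.onnx" is a placeholder path; adapt the input handling to your model.
import numpy as np
import onnxruntime as ort

providers = [
    "VitisAIExecutionProvider",  # try the NPU first
    "ROCMExecutionProvider",     # then the GPU
    "CPUExecutionProvider",      # fallback for unsupported nodes
]

sess = ort.InferenceSession("model.onnx", providers=providers)
print("Providers used by this session:", sess.get_providers())

# Build a dummy input from the model's first input metadata,
# substituting 1 for any symbolic (dynamic) dimensions.
inp = sess.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)

outputs = sess.run(None, {inp.name: x})
print("Output shapes:", [o.shape for o in outputs])
```

To exercise the GPU and the NPU independently (rather than letting one session split the graph between them), one common pattern is to create two sessions, each with a single provider plus the CPU fallback, and compare outputs and timings.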

Labels

ep:ROCm (questions/issues related to ROCm execution provider)
ep:VitisAI (issues related to Vitis AI execution provider)
