Provides Unet inference through the DirectML backend for WebUI. Tested on Windows.
- Use Olive to convert the model to ONNX.
- Install this extension and move the Unet model to the `models/Unet-onnx` directory (a minimal loading sketch follows this list).
- (Optional) Update the NVIDIA display driver to 532 or the AMD display driver to 23.5.2.
- Make sure the WebUI runs on the `dev` branch, then select the model whose name contains `[ORT]` in the settings.
- Start generation.
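
For reference, below is a minimal sketch of how the converted Unet can be loaded through ONNX Runtime's DirectML execution provider (`DmlExecutionProvider`), independently of the WebUI. The model path, input names, shapes, and dtypes are assumptions based on a common diffusers-style Unet ONNX export and may differ for your model.

```python
import numpy as np
import onnxruntime as ort

# Assumed location of the Olive-converted Unet; adjust to your installation.
MODEL_PATH = "models/Unet-onnx/model.onnx"

# DmlExecutionProvider requires the onnxruntime-directml package;
# CPUExecutionProvider is listed only as a fallback.
session = ort.InferenceSession(
    MODEL_PATH,
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# Input names and dtypes depend on how the model was exported; the names
# below follow the common diffusers Unet export and are assumptions.
latents = np.random.randn(2, 4, 64, 64).astype(np.float16)
timestep = np.array([981], dtype=np.int64)
text_embeddings = np.random.randn(2, 77, 768).astype(np.float16)

noise_pred = session.run(
    None,
    {
        "sample": latents,
        "timestep": timestep,
        "encoder_hidden_states": text_embeddings,
    },
)[0]
print(noise_pred.shape)  # expected: (2, 4, 64, 64)
```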
- Performance of the DirectML backend is affected by many factors and can fluctuate; adjusting the Windows power plan is recommended.
- Because additional hardware resources are required, running on low-spec devices is not recommended.
- At low sampling step counts, speed is similar to the original implementation; as the step count increases, the performance difference becomes more apparent.
- The extension has not been extensively tested; please file an issue if you encounter problems.