my_mmdeploy_onnx_demo

This is an ONNX backend inference implementation based on the MMDeploy open-source code, adapted for one's own tasks.


Usage and Deployment Instructions

Compilation instructions

src files compilation


cmake -G "Visual Studio 16 2019" -B "./build/" -DTENSORRT_DIR="E:\\PyCharmWorkPlace\\my_mmdeploy_onnx_demo\\thirdparty\\tensorrt" -DONNXRUNTIME_DIR="E:\\PyCharmWorkPlace\\my_mmdeploy_onnx_demo\\thirdparty\\onnxruntime"
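The configure command above passes TENSORRT_DIR and ONNXRUNTIME_DIR as cache variables. A CMakeLists.txt that consumes them might look like the following sketch; the target name and source file are illustrative assumptions, not taken from this repository:

```cmake
cmake_minimum_required(VERSION 3.14)
project(my_mmdeploy_onnx_demo LANGUAGES CXX)

# Paths supplied on the command line via -DTENSORRT_DIR=... / -DONNXRUNTIME_DIR=...
set(TENSORRT_DIR "" CACHE PATH "Root directory of the TensorRT SDK")
set(ONNXRUNTIME_DIR "" CACHE PATH "Root directory of the ONNX Runtime SDK")

# Hypothetical target; the actual target and sources in this repo may differ.
add_executable(demo src/main.cpp)

target_include_directories(demo PRIVATE
    "${TENSORRT_DIR}/include"
    "${ONNXRUNTIME_DIR}/include")
target_link_directories(demo PRIVATE
    "${TENSORRT_DIR}/lib"
    "${ONNXRUNTIME_DIR}/lib")
target_link_libraries(demo PRIVATE onnxruntime)
```

After configuring, the project can be built with `cmake --build ./build --config Release`.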


Please refer to my CSDN blog: Using MMDeploy (the precompiled package) to convert pth weights from MMxx (any library supported by MMDeploy) to ONNX, and using the C++ SDK to load the ONNX model into a DLL dynamic-link library that can be called on the Windows platform (also applicable to Linux).

Contributors

  1. Yangyang Wang

Acknowledgment

  1. MMDeploy
