Integrate onnxruntime-extensions into onnxruntime.#8143

Merged
wenbingl merged 2 commits into microsoft:master from Zuwei-Zhao:users/zuweizhao/integrate_ort_customops
Jul 1, 2021
Conversation

@Zuwei-Zhao
Contributor

@Zuwei-Zhao Zuwei-Zhao commented Jun 24, 2021

Description: Integrate onnxruntime-extensions into onnxruntime.

Motivation and Context

  • Integrate onnxruntime-extensions into onnxruntime to extend the capabilities of ONNX conversion and inference: https://github.com/microsoft/onnxruntime-extensions.
  • Add the --enable_onnxruntime_extensions argument to enable the custom operators from onnxruntime-extensions when building onnxruntime.
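The new flag is passed to the onnxruntime build script; a minimal sketch of such an invocation (the other flags and the Linux entry point are illustrative assumptions, not taken from this PR):

```shell
# From the onnxruntime repository root. --config and --parallel are
# illustrative; the flag added by this PR statically links the custom
# operators from onnxruntime-extensions into the resulting build.
./build.sh --config Release --parallel \
    --enable_onnxruntime_extensions
```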

@Zuwei-Zhao Zuwei-Zhao requested a review from a team as a code owner June 24, 2021 01:23
@snnn
Contributor

snnn commented Jun 24, 2021

@wenbingl please help review.

@wenbingl
Contributor

/azp run

@azure-pipelines

You have several pipelines (over 10) configured to build pull requests in this repository. Specify which pipelines you would like to run by using /azp run [pipelines] command. You can specify multiple pipelines using a comma separated list.

@wenbingl
Contributor

/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux CPU x64 NoContribops CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux GPU TensorRT CI Pipeline,Windows WebAssembly CI Pipeline

@azure-pipelines

Azure Pipelines successfully started running 6 pipeline(s).

@wenbingl
Contributor

@fs-eire, please review as well

@wenbingl wenbingl requested a review from fs-eire June 24, 2021 21:59
@wenbingl
Contributor

Description: Integrate onnxruntime-extensions into onnxruntime.

Motivation and Context

  • Integrate onnxruntime-extensions into onnxruntime to extend the capabilities of ONNX conversion and inference: https://github.com/microsoft/onnxruntime-extensions.
  • Add the --enable_onnxruntime_extensions argument to enable the custom operators from onnxruntime-extensions when building onnxruntime.

Added more details on why the custom ops should be statically linked.

@wenbingl
Contributor

/azp run Linux Nuphar CI Pipeline,Linux OpenVINO CI Pipeline,MacOS CI Pipeline,MacOS NoContribops CI Pipeline,Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline,orttraining-amd-gpu-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed

@azure-pipelines

You have several pipelines (over 10) configured to build pull requests in this repository. Specify which pipelines you would like to run by using /azp run [pipelines] command. You can specify multiple pipelines using a comma separated list.

@wenbingl
Contributor

/azp run Linux Nuphar CI Pipeline,Linux OpenVINO CI Pipeline,MacOS CI Pipeline,MacOS NoContribops CI Pipeline,Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline

@wenbingl
Contributor

/azp run orttraining-amd-gpu-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed

@azure-pipelines

Azure Pipelines successfully started running 7 pipeline(s).

@azure-pipelines

Azure Pipelines successfully started running 4 pipeline(s).

…egrate_ort_customops

# Conflicts:
#	include/onnxruntime/core/session/onnxruntime_c_api.h
#	onnxruntime/core/session/onnxruntime_c_api.cc
#	onnxruntime/core/session/ort_apis.h
@snnn
Contributor

snnn commented Jun 26, 2021

/azp run Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux CPU x64 NoContribops CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline, Linux Nuphar CI Pipeline, Linux OpenVINO CI Pipeline, MacOS CI Pipeline

@snnn
Contributor

snnn commented Jun 26, 2021

/azp run MacOS NoContribops CI Pipeline, Windows CPU CI Pipeline, Windows GPU CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows WebAssembly CI Pipeline, orttraining-amd-gpu-ci-pipeline, orttraining-linux-ci-pipeline, orttraining-linux-gpu-ci-pipeline, orttraining-ortmodule-distributed

@azure-pipelines

Azure Pipelines successfully started running 8 pipeline(s).

@azure-pipelines

Azure Pipelines successfully started running 9 pipeline(s).

@Zuwei-Zhao
Contributor Author

/azp run Windows CPU CI Pipeline

@azure-pipelines

Commenter does not have sufficient privileges for PR 8143 in repo microsoft/onnxruntime

@Zuwei-Zhao
Contributor Author

Commenter does not have sufficient privileges for PR 8143 in repo microsoft/onnxruntime

Hi @wenbingl and @snnn, could you please grant me access so that I can re-run the failed pipelines? Thanks.

@Zuwei-Zhao
Contributor Author

/azp run Windows CPU CI Pipeline

@azure-pipelines

Commenter does not have sufficient privileges for PR 8143 in repo microsoft/onnxruntime

@wenbingl
Contributor

/azp run Windows CPU CI Pipeline,Windows CPU CI Pipeline (build x86_release)

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@microsoft deleted a comment from the azure-pipelines bot Jun 29, 2021
@wenbingl
Contributor

@pranavsharma, can you take a look at this PR as well?

For the ONNX Runtime Web project, there is no way to load the onnxruntime-extensions DLL; and for ONNX Runtime Mobile, for the sake of the binary footprint, it is better to have the library statically linked.

@wenbingl wenbingl merged commit b46310b into microsoft:master Jul 1, 2021