This repository was archived by the owner on Jan 13, 2024. It is now read-only.
Merged
4 changes: 3 additions & 1 deletion _doc/sphinxdoc/source/api/tools.rst
@@ -116,7 +116,9 @@ the possibility later to only show a part of a graph.

.. autosignature:: mlprodict.plotting.plotting_onnx.plot_onnx

:ref:`onnxview <l-NB2>`
**notebook**

:ref:`onnxview <l-NB2>`, see also :ref:`numpyapionnxftrrst`.

Others
======
5 changes: 4 additions & 1 deletion _doc/sphinxdoc/source/conf.py
@@ -90,7 +90,10 @@
'mlinsights': 'http://www.xavierdupre.fr/app/mlinsights/helpsphinx/index.html',
'mlprodict': 'http://www.xavierdupre.fr/app/mlprodict/helpsphinx/index.html',
'mlstatpy': 'http://www.xavierdupre.fr/app/mlstatpy/helpsphinx/index.html',
'numpy': 'https://numpy.org/',
'numba': 'https://numba.org/',
'numpy': ('https://www.numpy.org/',
('https://docs.scipy.org/doc/numpy/reference/generated/numpy.{0}.html', 1),
('https://docs.scipy.org/doc/numpy/reference/generated/numpy.{0}.{1}.html', 2)),
'openmp': 'https://www.openmp.org/',
'ONNX': 'https://onnx.ai/',
'onnx': 'https://github.com/onnx/onnx',
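The new ``'numpy'`` entry in ``conf.py`` above mixes a landing-page URL with two URL templates and their argument counts. As a rough sketch of how such an ``:epkg:`` entry could be resolved (the real logic lives in the Sphinx extension this project uses; `resolve_epkg` is a hypothetical name, not part of mlprodict):

```python
# Hypothetical sketch: resolve an epkg entry that is either a plain URL
# or a tuple (landing_page, (template, n_args), ...) against a target
# like 'numpy.random.rand'.  Illustrative only.

def resolve_epkg(entry, target):
    """Map an epkg target such as 'numpy.mean' to a documentation URL."""
    if isinstance(entry, str):
        # plain entry: a single landing-page URL
        return entry
    # first element is the landing page, the rest are (template, n_args) pairs
    base, *templates = entry
    parts = target.split('.')[1:]  # drop the leading package name
    for template, n_args in templates:
        if len(parts) == n_args:
            return template.format(*parts)
    return base

entry = ('https://www.numpy.org/',
         ('https://docs.scipy.org/doc/numpy/reference/generated/numpy.{0}.html', 1),
         ('https://docs.scipy.org/doc/numpy/reference/generated/numpy.{0}.{1}.html', 2))

print(resolve_epkg(entry, 'numpy'))              # landing page
print(resolve_epkg(entry, 'numpy.mean'))         # one-argument template
print(resolve_epkg(entry, 'numpy.random.rand'))  # two-argument template
```

With this scheme, ``:epkg:`numpy``` links to the home page while ``:epkg:`numpy.mean``` and ``:epkg:`numpy.random.rand``` land on the generated reference pages.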
67 changes: 67 additions & 0 deletions _doc/sphinxdoc/source/index.rst
@@ -148,6 +148,73 @@ following :epkg:`ONNX` graph.

Notebook :ref:`onnxvisualizationrst`
shows how to visualize an :epkg:`ONNX` pipeline.
The package also contains a collection of tools
to help convert code to ONNX. A short list of
them:

* **Python runtime for ONNX:**
:class:`OnnxInference <mlprodict.onnxrt.onnx_inference.OnnxInference>`
is mostly used to check that an ONNX graph produces the expected output.
If it fails, it fails inside Python code and not inside C++ code.
This class can also call :epkg:`onnxruntime` by
using ``runtime=='onnxruntime1'``. A third runtime,
``runtime=='python_compiled'``, compiles a python function equivalent
to code calling the operators one by one. This makes the ONNX
graph easier to read (see :ref:`l-onnx-tutorial`).
* **Intermediate results:**
the python runtime may display all intermediate results,
their shape if `verbosity == 1`, their value if `verbosity > 10`,
see :ref:`l-onnx-tutorial`. This cannot be done with ``runtime=='onnxruntime1'``
but it is still possible to get the intermediate results
(see :meth:`OnnxInference.run <mlprodict.onnxrt.onnx_inference.OnnxInference.run>`).
The class will build all subgraphs from the inputs to every intermediate
result. If the graph has *N* operators, the cost of this is
:math:`O(N^2)`.
* **Extract a subpart of an ONNX graph:**
when an ONNX graph does not load, it is possible to modify it or to extract
a subpart in order to check a small piece of it. Function
:func:`select_model_inputs_outputs
<mlprodict.onnx_tools.onnx_manipulations.select_model_inputs_outputs>`
may be used to change the inputs and/or the outputs.
* **Change the opset**: function
:func:`overwrite_opset
<mlprodict.onnx_tools.onnx_manipulations.overwrite_opset>`
overwrites the opset; it is used to check for which opset (ONNX version)
a graph is valid.
* **Visualization in a notebook**: a magic command to display
small ONNX graphs in notebooks: :ref:`onnxvisualizationrst`.
* **Text visualization for ONNX:** a way to visualize an ONNX graph
with text only: :func:`onnx_text_plot <mlprodict.plotting.text_plot.onnx_text_plot>`.
* **Text visualization of TreeEnsemble:** a way to visualize the trees
described by an operator TreeEnsembleRegressor or TreeEnsembleClassifier,
see :func:`onnx_text_plot_tree <mlprodict.plotting.text_plot.onnx_text_plot_tree>`.
* **Export ONNX graph to numpy:** the numpy code produces the same
results as the ONNX graph (see :func:`export2numpy
<mlprodict.onnx_tools.onnx_export.export2numpy>`)
* **Export ONNX graph to ONNX API:** this produces
code based on the ONNX API which replicates the ONNX graph
(see :func:`export2onnx
<mlprodict.onnx_tools.onnx_export.export2onnx>`)
* **Export ONNX graph to tf2onnx:** again a function which
creates an ONNX graph, but based on the :epkg:`tf2onnx` API
(see :func:`export2tf2onnx
<mlprodict.onnx_tools.onnx_export.export2tf2onnx>`)
* **Numpy API for ONNX:** many computational functions are
written with :epkg:`numpy` and converting them to ONNX may take
quite some time for users not familiar with ONNX. This API implements
many functions from :epkg:`numpy` with ONNX and lets the user
combine them. It is as if numpy functions were executed by an
ONNX runtime: :ref:`l-numpy-api-for-onnx`.
* **Benchmark scikit-learn models converted into ONNX:** a simple function to
benchmark ONNX against *scikit-learn* for a simple model:
:ref:`l-example-onnx-benchmark`
* **Accelerate scikit-learn predictions:**
what if *transform* or *predict* were replaced by an implementation
based on ONNX, or a numpy version of it? Would it be faster?
:ref:`l-Speedup-pca`
* **Profiling onnxruntime:** :epkg:`onnxruntime` can memorize the time
spent in each operator. The following notebook shows how to retrieve
the results and display them: :ref:`onnxprofileortrst`.
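The "Export ONNX graph to numpy" item in the list above says the generated numpy code produces the same results as the ONNX graph. As a hand-written sketch of the kind of code such an export could look like for a tiny graph `Y = Relu(X @ W + B)` (the real ``export2numpy`` generates this automatically from an ONNX model; the function and variable names here are illustrative):

```python
import numpy as np

# Hand-written sketch of numpy code equivalent to a three-node ONNX
# graph: MatMul -> Add -> Relu.  Illustrative, not export2numpy output.

def numpy_linear_relu(X, W, B):
    # MatMul node
    r0 = X @ W
    # Add node
    r1 = r0 + B
    # Relu node
    return np.maximum(r1, 0)

X = np.array([[1., -2.], [3., 4.]])
W = np.array([[1., 0.], [0., -1.]])
B = np.array([0.5, 0.5])
print(numpy_linear_relu(X, W, B))
```

Running the ONNX graph and this function on the same inputs should then produce matching arrays, which is the property the exporter relies on.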

+----------------------+---------------------+---------------------+--------------------+------------------------+------------------------------------------------+
| :ref:`l-modules` | :ref:`l-functions` | :ref:`l-classes` | :ref:`l-methods` | :ref:`l-staticmethods` | :ref:`l-properties` |
@@ -1,5 +1,5 @@
"""
@brief test log(time=9s)
@brief test log(time=33s)
"""

import unittest
2 changes: 1 addition & 1 deletion _unittests/ut_cli/test_cli_validate.py
@@ -1,5 +1,5 @@
"""
@brief test tree node (time=15s)
@brief test tree node (time=42s)
"""
import os
import sys
6 changes: 4 additions & 2 deletions _unittests/ut_npy/test_function_transformer.py
@@ -111,7 +111,8 @@ def test_function_transformer_custom_log(self):
onnx_model = to_onnx(tr, x)
oinf = OnnxInference(onnx_model)
y_onx = oinf.run({'X': x})
self.assertEqualArray(y_exp, y_onx['variable'], decimal=5)
name = oinf.output_names[0]
self.assertEqualArray(y_exp, y_onx[name], decimal=5)

@ignore_warnings((DeprecationWarning, RuntimeWarning))
def test_function_transformer_custom_logn(self):
@@ -122,7 +123,8 @@ def test_function_transformer_custom_logn(self):
onnx_model = to_onnx(tr, x)
oinf = OnnxInference(onnx_model)
y_onx = oinf.run({'X': x})
self.assertEqualArray(y_exp, y_onx['variable'], decimal=5)
name = oinf.output_names[0]
self.assertEqualArray(y_exp, y_onx[name], decimal=5)


if __name__ == "__main__":
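The test change above swaps a hardcoded output name (``'variable'``) for ``oinf.output_names[0]``, so the assertion survives if the converter names the output differently. A minimal sketch of that pattern, with a plain dict standing in for the runtime's result (running mlprodict itself is not assumed here; ``output_names`` and ``results`` are hand-made stand-ins):

```python
import numpy as np

# Sketch of the output-name-agnostic assertion pattern used in the
# updated tests.  'output_names' mimics OnnxInference.output_names and
# 'results' mimics the dict returned by OnnxInference.run; both are
# illustrative stand-ins, not real mlprodict objects.

output_names = ['y']                    # converter-chosen name, not 'variable'
results = {'y': np.array([1.0, 2.0])}   # what run({...}) would return

name = output_names[0]                  # look the name up instead of guessing
y_onx = results[name]
np.testing.assert_allclose(y_onx, np.array([1.0, 2.0]))
```

The point of the pattern is that the test keys into the results with whatever name the model declares, so renaming the output in the converter no longer breaks the assertion.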
3 changes: 2 additions & 1 deletion _unittests/ut_onnx_conv/test_lightgbm_tree_structure.py
@@ -21,7 +21,8 @@
from sklearn.datasets import load_iris
from mlprodict.onnx_conv.helpers.lgbm_helper import (
modify_tree_for_rule_in_set, restore_lgbm_info)
from mlprodict.onnx_conv.parsers.parse_lightgbm import MockWrappedLightGbmBoosterClassifier
from mlprodict.onnx_conv.operator_converters.parse_lightgbm import (
MockWrappedLightGbmBoosterClassifier)
from mlprodict.onnx_conv import register_converters, to_onnx
from mlprodict.onnxrt import OnnxInference
