[DOC] Codebase walkthrough with Vector add example #2206
Conversation
docs/dev/codebase_walkthrough.rst
Outdated
- ``topi`` - Compute definitions and backend schedules for standard neural network operators.
- ``nnvm`` - C++ code and Python frontend for graph optimization and compilation. Depends on three directories above.

Using standard Deep Learning terminologies, ``nnvm`` is the component that manages a computational graph, and nodes in a graph are compiled and executed using infrastructures implemented in ``src`` and ``python``. Operators corresponding to each node are registered in ``nnvm``. Registration can be done via C++ or Python. Implemenations for operators are in ``topi``, and they are also coded in either C++ or Python.
Implemenations -> Implementations
docs/dev/codebase_walkthrough.rst
Outdated
Using standard Deep Learning terminologies, ``nnvm`` is the component that manages a computational graph, and nodes in a graph are compiled and executed using infrastructures implemented in ``src`` and ``python``. Operators corresponding to each node are registered in ``nnvm``. Registration can be done via C++ or Python. Implemenations for operators are in ``topi``, and they are also coded in either C++ or Python.

When an user invokes graph compilation by ``nnvm.compiler.build(...)``, the following sequence of actions happens for each node in the graph:
a user
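For context, the per-node flow described in the quoted passage can be sketched in pure Python. This is a conceptual mimic only: the names (``compile_graph``, the dict-based graph and registry) are hypothetical illustrations, not the real nnvm/TVM API.

```python
# Illustrative sketch only: a pure-Python mimic of the per-node compilation
# flow. Names here are hypothetical, not the actual nnvm/TVM API.

def compile_graph(graph, op_registry):
    """For each node, look up its registered compute rule and 'lower' it."""
    compiled = {}
    for node in graph:
        compute = op_registry[node["op"]]       # registered via C++ or Python
        compiled[node["name"]] = compute(node)  # produce a lowered function
    return compiled

# Hypothetical registry: operator name -> compute implementation (as in topi)
registry = {"add": lambda n: "lowered_add(" + ", ".join(n["inputs"]) + ")"}
graph = [{"name": "C", "op": "add", "inputs": ["A", "B"]}]
print(compile_graph(graph, registry))  # {'C': 'lowered_add(A, B)'}
```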
docs/dev/codebase_walkthrough.rst
Outdated
The process of ``tvm.build()`` can be divided into two steps:

- Lowering, where an high level, initial loop nest structures are transformed into a final, low level IR
a high level
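The lowering step described above can be pictured as a pipeline of IR-rewriting passes. The sketch below uses plain strings and lambdas as stand-ins for TVM's real Stmt/pass infrastructure; every name in it is an illustrative assumption, not the actual API.

```python
# Illustrative sketch: lowering as a sequence of IR-rewriting passes.
# Strings stand in for the real loop-nest IR; pass names are hypothetical.

def lower(ir, passes):
    for p in passes:
        ir = p(ir)  # each pass rewrites the IR one step closer to low level
    return ir

passes = [
    # loop normalization: make the loop structure explicit
    lambda ir: ir.replace("for i in [0, n)", "loop(i, 0, n)"),
    # make memory accesses explicit load/store operations
    lambda ir: ir.replace("C[i] = A[i] + B[i]",
                          "store(C, i, add(load(A, i), load(B, i)))"),
]

high_level = "for i in [0, n): C[i] = A[i] + B[i]"
print(lower(high_level, passes))
# loop(i, 0, n): store(C, i, add(load(A, i), load(B, i)))
```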
docs/dev/codebase_walkthrough.rst
Outdated
One of the interesting aspects of TVM codebase is that interop between C++ and Python is not unidirectional. Typically, all code that do heavy liftings are implemented in C++, and Python bindings are provided for user interface. This is also true in TVM, but in TVM codebase, C++ code also call into functions defined in a Python module. For example, the convolution operator is implemented in Python, and its implementation is invoked from C++ code in nnvm.

At the time of writing (Nov. 30, 2018), there is an going effort to reimplement functionality offered by ``nnvm`` in a new intermidiate representation called Relay. New Relay code resides in ``src/relay`` and ``python/tvm/relay``.
an ongoing effort
intermidiate -> intermediate
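The bidirectional interop described in the quoted passage rests on a global function registry: Python registers an implementation by name, and compiled C++ code later looks it up and calls back into Python. The sketch below mimics the spirit of that registry in pure Python; ``register_func``, ``cxx_invoke``, and the ``conv2d`` stub are hypothetical stand-ins, not TVM's real PackedFunc machinery.

```python
# Illustrative sketch of bidirectional C++/Python interop via a global
# function registry. All names here are hypothetical mimics.

_registry = {}

def register_func(name):
    """Decorator: register a Python function under a global name."""
    def wrap(f):
        _registry[name] = f
        return f
    return wrap

# Python side: define and register an operator implementation
# (analogous to how topi registers compute rules).
@register_func("conv2d")
def conv2d(x):
    return "conv2d(" + x + ")"

# "C++ side" mimic: compiled code looks the function up by name and calls it.
def cxx_invoke(name, arg):
    return _registry[name](arg)

print(cxx_invoke("conv2d", "data"))  # conv2d(data)
```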
docs/dev/codebase_walkthrough.rst
Outdated
B = tvm.placeholder((n,), name='B')
C = tvm.compute(A.shape, lambda i: A[i] + B[i], name="C")

Here, types of ``A``, ``B``, ``C`` are ``tvm.tensor.Tensor``, defined in ``python/tvm/tensor.py``. The Python ``Tensor`` is backed by C++ ``Tensor``, implemented in ``include/tvm/tensor.h`` and ``src/lang/tensor.cc``. All Python types in TVM can be thought of as a handle to the underlining C++ type with the same name. If you look at the definition of Python ``Tensor`` type below, you can see it is an subclass of ``NodeBase``.
an subclass -> a subclass
@vinx13 Thanks. Fixed.
Would probably be good to port this to Relay soonish too. One of the Relay developers can probably manage that though, I will add a note to the tracking issue.
docs/dev/codebase_walkthrough.rst
Outdated
B = tvm.placeholder((n,), name='B')
C = tvm.compute(A.shape, lambda i: A[i] + B[i], name="C")

Here, types of ``A``, ``B``, ``C`` are ``tvm.tensor.Tensor``, defined in ``python/tvm/tensor.py``. The Python ``Tensor`` is backed by C++ ``Tensor``, implemented in ``include/tvm/tensor.h`` and ``src/lang/tensor.cc``. All Python types in TVM can be thought of as a handle to the underlying C++ type with the same name. If you look at the definition of Python ``Tensor`` type below, you can see it is a subclass of ``NodeBase``.
underlining -> underlying
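The handle pattern discussed in the quoted passage (a Python ``Tensor`` as a thin wrapper over an underlying C++ object) can be sketched as follows. The class bodies here are a simplified mimic for illustration, not the real definitions in ``python/tvm/tensor.py``.

```python
# Illustrative sketch of the Python-handle-over-C++-object pattern.
# Class names echo the walkthrough, but the bodies are hypothetical.

class NodeBase:
    """Base for all Python-side handles to underlying C++ nodes."""
    def __init__(self, handle=None):
        self.handle = handle  # in TVM this would wrap a raw C++ pointer

class Tensor(NodeBase):
    """Python-side handle mimicking the C++ ``Tensor``."""
    def __init__(self, shape, name, handle=None):
        super().__init__(handle)
        self.shape = shape
        self.name = name

# Mimic of ``C = tvm.compute(A.shape, lambda i: A[i] + B[i], name="C")``:
A = Tensor((1024,), "A")
B = Tensor((1024,), "B")
C = Tensor(A.shape, "C")
print(isinstance(C, NodeBase), C.shape)  # True (1024,)
```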
7cd3e98 to 1e40e0c
The first stab at #2160.
@tqchen @yzhliu @merrymercy @Ravenwater please review.