
Tengine Lite


Chinese version

Introduction

Tengine Lite is developed by OPEN AI LAB. The project meets the demand for fast and efficient deployment of deep learning neural network models on embedded devices. To enable cross-platform deployment in many AIoT applications, it rebuilds the original Tengine project in C and applies deep framework tailoring to suit the limited resources of embedded devices. It also adopts a completely separated front-end/back-end design, which allows it to be ported and deployed onto CPU, GPU, NPU and other heterogeneous computing units rapidly and conveniently. At the same time, it remains compatible with the original Tengine API and tmfile model format, which reduces the cost of evaluation and migration.
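Because the API stays compatible with the original Tengine, a typical inference flow loads a tmfile, binds an input buffer, and runs the graph. The sketch below outlines this against the public C API; the header path, model file name and input shape are placeholders, so treat it as a hedged outline and refer to the examples directory for version-accurate usage.

```c
/* Minimal inference sketch with Tengine's C API.
 * Hedged: the header path, tmfile name and input shape below are
 * placeholders; check the examples directory for exact usage. */
#include <stdio.h>
#include <stdlib.h>
#include "tengine/c_api.h"

int main(void)
{
    const char* model_file = "mobilenet.tmfile";   /* placeholder tmfile  */
    int dims[4] = {1, 3, 224, 224};                /* assumed NCHW shape  */
    int input_count = 1 * 3 * 224 * 224;

    /* Initialize the library before any other call. */
    if (init_tengine() != 0)
        return -1;

    /* Load a tmfile model; "tengine" selects the tmfile serializer. */
    graph_t graph = create_graph(NULL, "tengine", model_file);
    if (graph == NULL)
        return -1;

    /* Bind an input buffer to the first input tensor. */
    float* input_data = (float*)malloc(input_count * sizeof(float));
    tensor_t input_tensor = get_graph_input_tensor(graph, 0, 0);
    set_tensor_shape(input_tensor, dims, 4);
    set_tensor_buffer(input_tensor, input_data, input_count * sizeof(float));

    /* Pre-allocate resources, run once, then read the output. */
    prerun_graph(graph);
    run_graph(graph, 1);

    tensor_t output_tensor = get_graph_output_tensor(graph, 0, 0);
    float* output_data = (float*)get_tensor_buffer(output_tensor);
    printf("first output value: %f\n", output_data[0]);

    /* Release everything in reverse order. */
    postrun_graph(graph);
    destroy_graph(graph);
    free(input_data);
    release_tengine();
    return 0;
}
```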

The core code of Tengine Lite consists of 4 modules:

  • dev: NN operator back-end module; it currently provides the CPU implementation, with GPU and NPU reference code to be open-sourced gradually (a sketch of how a back-end device is selected follows this list);
  • lib: core components of the framework, including the NNIR, computational graphs, hardware resources, and the model scheduling and execution modules;
  • op: NN operator front-end module, which implements the registration and initialization of NN operators;
  • serializer: model decoder, which decodes the binary tmfile format into serialized model parameters.
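As an illustration of the separated front-end/back-end design, the sketch below attaches a non-CPU device to a graph through a context before loading the model. The device name "TIMVX" and the tmfile path are assumptions borrowed from the NPU examples; which devices are actually available depends on how the library was built.

```c
/* Hedged sketch: selecting a back-end device through a context.
 * "TIMVX" and the tmfile path are placeholders; available device
 * names depend on the build options. */
#include <stdio.h>
#include "tengine/c_api.h"

int main(void)
{
    init_tengine();

    /* Create a scheduling context and attach an NPU back-end to it. */
    context_t context = create_context("npu_ctx", 1);
    if (add_context_device(context, "TIMVX") != 0)
        fprintf(stderr, "device not available, graph will run on CPU\n");

    /* Graphs created with this context are scheduled onto that device. */
    graph_t graph = create_graph(context, "tengine", "mobilenet_uint8.tmfile");
    if (graph == NULL)
    {
        fprintf(stderr, "failed to load model\n");
        destroy_context(context);
        release_tengine();
        return -1;
    }

    /* ... set inputs, prerun_graph() and run_graph() as in the sketch above ... */

    destroy_graph(graph);
    destroy_context(context);
    release_tengine();
    return 0;
}
```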

Architecture

Tengine Lite architecture diagram

How to use

Compile

Example

  • examples provides basic classification and detection use cases, which are continuously updated based on requests raised in issues.

Model Zoo

  • Tengine model zoo: the model zoo samples are compatible with the original Tengine (password: hhgc).

Model Convert tool

  • Pre-compiled version: a pre-compiled model convert tool is provided for Linux systems;
  • Online convert tool: based on WebAssembly (models are converted locally in the browser; no private data is uploaded);
  • Source compilation: refer to the Tengine-Convert-Tools project to build the convert tool yourself.

Speed assessment

  • Benchmark: a basic tool for assessing network inference speed; pull requests are welcome.

Roadmap

Acknowledgement

Tengine Lite borrows ideas from, and is developed on the basis of, the following projects:

License

FAQ

Tech Forum