
[Backend] Add YOLOv5, PPYOLOE and PP-Liteseg for RV1126 #647

Merged: 21 commits, Dec 5, 2022
Changes shown below are from 11 of the 21 commits.

Commits
750a0b5 add yolov5 and ppyoloe for rk1126 (yeliang2258, Nov 21, 2022)
c986c4b Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Nov 21, 2022)
3ce8f23 Merge branch 'develop' into rk1126_demo_1_dev (jiangjiajun, Nov 22, 2022)
409dc85 Merge branch 'develop' into rk1126_demo_1_dev (jiangjiajun, Nov 22, 2022)
7b1d546 update code, rename rk1126 to rv1126 (yeliang2258, Nov 24, 2022)
a2f1c6a Merge branch 'rk1126_demo_1_dev' of https://github.com/yeliang2258/Fa… (yeliang2258, Nov 24, 2022)
3d82a4e Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Nov 24, 2022)
442f7dc add PP-Liteseg (yeliang2258, Nov 30, 2022)
7a306a7 update lite lib (yeliang2258, Nov 30, 2022)
af0b598 Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Nov 30, 2022)
66de8ac Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Nov 30, 2022)
336675a updade doc for PPYOLOE (yeliang2258, Nov 30, 2022)
6d1e5a5 update doc (yeliang2258, Nov 30, 2022)
c759eaa fix docs (yeliang2258, Nov 30, 2022)
7a5044d Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Nov 30, 2022)
cc34af4 fix doc and examples (yeliang2258, Dec 5, 2022)
5c42a38 Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Dec 5, 2022)
cade7ea update code (yeliang2258, Dec 5, 2022)
07497c9 uodate doc (yeliang2258, Dec 5, 2022)
d8dc6f5 Merge remote-tracking branch 'upstream/develop' into rk1126_demo_1_dev (yeliang2258, Dec 5, 2022)
d7db32b update doc (yeliang2258, Dec 5, 2022)
6 changes: 3 additions & 3 deletions README_CN.md
100644 → 100755
@@ -68,8 +68,8 @@

<div id="fastdeploy-quick-start-python"></div>

<details close>
<details close>

<summary><b>Python SDK Quick Start (click to expand)</b></summary><div>

#### Quick Install
@@ -131,7 +131,7 @@ cv2.imwrite("vis_image.jpg", vis_im)
<div id="fastdeploy-quick-start-cpp"></div>

<details close>

<summary><b>C++ SDK Quick Start (click to expand)</b></summary><div>


Empty file modified README_EN.md
100644 → 100755
Empty file.
2 changes: 1 addition & 1 deletion cmake/paddlelite.cmake
@@ -60,7 +60,7 @@ else() # Linux
if(CMAKE_HOST_SYSTEM_PROCESSOR MATCHES "aarch64")
set(PADDLELITE_URL "${PADDLELITE_URL_PREFIX}/lite-linux-arm64-20220920.tgz")
elseif(TARGET_ABI MATCHES "armhf")
set(PADDLELITE_URL "https://bj.bcebos.com/fastdeploy/test/lite-linux_armhf_1101.tgz")
set(PADDLELITE_URL "https://bj.bcebos.com/fastdeploy/test/lite-linux_armhf_1130.tgz")
else()
message(FATAL_ERROR "Only support Linux aarch64 now, x64 is not supported with backend Paddle Lite.")
endif()
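For context, the hunk above only swaps the armhf Paddle Lite archive URL. The surrounding selection logic in cmake/paddlelite.cmake, reconstructed as a sketch from the visible diff context (anything outside the hunk is an assumption), looks roughly like:

```cmake
# Sketch of the Paddle Lite prebuilt-library selection, reconstructed from the diff.
if(CMAKE_HOST_SYSTEM_PROCESSOR MATCHES "aarch64")
  set(PADDLELITE_URL "${PADDLELITE_URL_PREFIX}/lite-linux-arm64-20220920.tgz")
elseif(TARGET_ABI MATCHES "armhf")
  # This PR bumps the armhf build from the 1101 archive to the 1130 archive.
  set(PADDLELITE_URL "https://bj.bcebos.com/fastdeploy/test/lite-linux_armhf_1130.tgz")
else()
  message(FATAL_ERROR "Only support Linux aarch64 now, x64 is not supported with backend Paddle Lite.")
endif()
```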
5 changes: 3 additions & 2 deletions docs/cn/build_and_install/README.md
@@ -8,10 +8,10 @@
## Build and Install from Source
- [GPU deployment environment](gpu.md)
- [CPU deployment environment](cpu.md)
- [CPU deployment environment](ipu.md)
- [IPU deployment environment](ipu.md)
- [Jetson deployment environment](jetson.md)
- [Android deployment environment](android.md)
- [Rockchip RK1126 deployment environment](rk1126.md)
- [Rockchip RV1126 deployment environment](rv1126.md)


## FastDeploy Build Options
@@ -22,6 +22,7 @@
| ENABLE_PADDLE_BACKEND | OFF by default; whether to build with the Paddle Inference backend (recommended ON for CPU/GPU) |
| ENABLE_LITE_BACKEND | OFF by default; whether to build with the Paddle Lite backend (must be ON when building the Android library) |
| ENABLE_RKNPU2_BACKEND | OFF by default; whether to build with the RKNPU2 backend (recommended ON for RK3588/RK3568/RK3566) |
| ENABLE_TIMVX | OFF by default; set to ON when deploying on RV1126/RV1109 |
| ENABLE_TRT_BACKEND | OFF by default; whether to build with the TensorRT backend (recommended ON for GPU) |
| ENABLE_OPENVINO_BACKEND | OFF by default; whether to build with the OpenVINO backend (recommended ON for CPU) |
| ENABLE_VISION | OFF by default; whether to build the vision model deployment module |
@@ -1,4 +1,4 @@
# Rockchip RK1126 Deployment Environment: Build and Install
# Rockchip RV1126 Deployment Environment: Build and Install

FastDeploy supports deployment and inference on Rockchip SoCs through the Paddle-Lite backend.
For more details, see: [Paddle Lite deployment examples](https://paddle-lite.readthedocs.io/zh/develop/demo_guides/verisilicon_timvx.html).
@@ -54,10 +54,10 @@ cmake -DCMAKE_TOOLCHAIN_FILE=./../cmake/timvx.cmake \
-DENABLE_VISION=ON \ # whether to build the vision model deployment module (optional)
-Wno-dev ..

# Build FastDeploy RK1126 C++ SDK
# Build FastDeploy RV1126 C++ SDK
make -j8
make install
```
After the build completes, a fastdeploy-tmivx directory is generated, indicating that the Paddle Lite TIM-VX based FastDeploy library has been built.

To deploy PaddleClas classification models on RK1126, see: [PaddleClas RK1126 board C++ deployment example](../../../examples/vision/classification/paddleclas/rk1126/README.md)
To deploy PaddleClas classification models on RV1126, see: [PaddleClas RV1126 board C++ deployment example](../../../examples/vision/classification/paddleclas/rv1126/README.md)
11 changes: 0 additions & 11 deletions examples/vision/classification/paddleclas/rk1126/README.md

This file was deleted.

11 changes: 11 additions & 0 deletions examples/vision/classification/paddleclas/rv1126/README.md
@@ -0,0 +1,11 @@
# Deploying Quantized PaddleClas Models on RV1126
FastDeploy currently supports deploying quantized PaddleClas models to RV1126 via Paddle Lite.

For model quantization and downloads of quantized models, see: [Model Quantization](../quantize/README.md)


## Detailed Deployment Documentation

Only C++ deployment is supported on RV1126.

- [C++ deployment](cpp)
@@ -1,22 +1,22 @@
# PaddleClas RK1126 Board C++ Deployment Example
The `infer.cc` in this directory helps users quickly deploy quantized PaddleClas models on RK1126 with accelerated inference.
# PaddleClas rv1126 Board C++ Deployment Example
The `infer.cc` in this directory helps users quickly deploy quantized PaddleClas models on rv1126 with accelerated inference.

## Deployment Preparation
### Preparing the FastDeploy Cross-Compilation Environment
- 1. For the required hardware/software environment and cross-compilation setup, see: [FastDeploy cross-compilation environment setup](../../../../../../docs/cn/build_and_install/rk1126.md#交叉编译环境搭建)
- 1. For the required hardware/software environment and cross-compilation setup, see: [FastDeploy cross-compilation environment setup](../../../../../../docs/cn/build_and_install/rv1126.md#交叉编译环境搭建)

### Preparing Quantized Models
- 1. Users can directly deploy the quantized models provided by FastDeploy.
- 2. Users can quantize models themselves with FastDeploy's [one-click auto-compression tool](../../../../../../tools/auto_compression/) and deploy the resulting quantized models. (Note: inference with a quantized classification model still requires the inference_cls.yaml file from the FP32 model folder; a self-quantized model folder does not contain this yaml file, so copy it from the FP32 model folder into the quantized model folder.)
- For more information on quantization, see [Model Quantization](../../quantize/README.md)
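The yaml note in step 2 above can be scripted. A minimal sketch, using hypothetical folder names `ResNet50_vd_infer` (FP32 export, which contains inference_cls.yaml) and `ResNet50_vd_quant` (auto-compression output, which does not):

```shell
# Simulate the two model folders (names are illustrative placeholders).
mkdir -p ResNet50_vd_infer ResNet50_vd_quant
touch ResNet50_vd_infer/inference_cls.yaml

# The fix the note describes: copy the yaml from the FP32 folder
# into the quantized model folder before running inference.
cp ResNet50_vd_infer/inference_cls.yaml ResNet50_vd_quant/
```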

## Deploying the Quantized ResNet50_Vd Classification Model on RK1126
Follow these steps to deploy the quantized ResNet50_Vd model on RK1126:
1. Cross-compile the FastDeploy library; see: [Cross-compiling FastDeploy](../../../../../../docs/cn/build_and_install/rk1126.md#基于-paddlelite-的-fastdeploy-交叉编译库编译)
## Deploying the Quantized ResNet50_Vd Classification Model on rv1126
Follow these steps to deploy the quantized ResNet50_Vd model on rv1126:
1. Cross-compile the FastDeploy library; see: [Cross-compiling FastDeploy](../../../../../../docs/cn/build_and_install/rv1126.md#基于-paddlelite-的-fastdeploy-交叉编译库编译)

2. Copy the built library into the current directory, for example:
```bash
cp -r FastDeploy/build/fastdeploy-tmivx/ FastDeploy/examples/vision/classification/paddleclas/rk1126/cpp/
cp -r FastDeploy/build/fastdeploy-tmivx/ FastDeploy/examples/vision/classification/paddleclas/rv1126/cpp/
```

3. Download the model and example image needed for deployment into the current directory:
@@ -41,7 +41,7 @@ make install
5. Deploy the ResNet50_vd classification model to the Rockchip RV1126 via adb:
```bash
# enter the install directory
cd FastDeploy/examples/vision/classification/paddleclas/rk1126/cpp/build/install/
cd FastDeploy/examples/vision/classification/paddleclas/rv1126/cpp/build/install/
# usage: bash run_with_adb.sh <demo to run> <model path> <image path> <device DEVICE_ID>
bash run_with_adb.sh infer_demo ResNet50_vd_infer ILSVRC2012_val_00000010.jpeg $DEVICE_ID
```
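The internals of run_with_adb.sh are not shown in this PR; a wrapper of this general shape (entirely hypothetical: the remote path, variable names, and structure are assumptions) would match the documented four-argument usage. It builds the adb commands and, in this dry-run sketch, prints them instead of executing:

```shell
# Hypothetical dry-run sketch of an adb deployment wrapper; the real
# run_with_adb.sh may differ. Defaults mirror the documented example call.
DEMO=${1:-infer_demo}
MODEL=${2:-ResNet50_vd_infer}
IMAGE=${3:-ILSVRC2012_val_00000010.jpeg}
DEVICE_ID=${4:-0123456789}
REMOTE_DIR=/data/local/tmp/fastdeploy   # assumed device-side working directory

PUSH_CMD="adb -s ${DEVICE_ID} push ${DEMO} ${MODEL} ${IMAGE} ${REMOTE_DIR}"
RUN_CMD="adb -s ${DEVICE_ID} shell \"cd ${REMOTE_DIR} && ./${DEMO} ${MODEL} ${IMAGE}\""

# Dry run: print the commands so they can be inspected before use.
echo "${PUSH_CMD}"
echo "${RUN_CMD}"
```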
@@ -50,4 +50,4 @@ bash run_with_adb.sh infer_demo ResNet50_vd_infer ILSVRC2012_val_00000010.jpeg $

<img width="640" src="https://user-images.githubusercontent.com/30516196/200767389-26519e50-9e4f-4fe1-8d52-260718f73476.png">

Note in particular that models deployed on RK1126 must be quantized models; for model quantization, see: [Model Quantization](../../../../../../docs/cn/quantize.md)
Note in particular that models deployed on rv1126 must be quantized models; for model quantization, see: [Model Quantization](../../../../../../docs/cn/quantize.md)

This file was deleted.

This file was deleted.
