
Update the documentation links in the Paddle repo. #24

Merged
merged 1 commit into from Nov 7, 2017

8 changes: 4 additions & 4 deletions README.md
@@ -3,9 +3,9 @@
Here mainly describes how to deploy PaddlePaddle to the mobile end, as well as some deployment optimization methods and some benchmark.

## How to build PaddlePaddle for mobile
-- [Build PaddlePaddle for Android](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/cross_compiling/cross_compiling_for_android_cn.md)
-- Build PaddlePaddle for IOS
-- [Build PaddlePaddle for Raspberry Pi3](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/cross_compiling/cross_compiling_for_raspberry_cn.md)
+- Build PaddlePaddle for Android [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_android_cn.md)] [[English](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_android_en.md)]
+- Build PaddlePaddle for IOS [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_ios_cn.md)]
+- Build PaddlePaddle for Raspberry Pi3 [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_raspberry_cn.md)] [[English](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_raspberry_en.md)]
- Build PaddlePaddle for PX2
- How to build PaddlePaddle mobile inference library with minimum size.

@@ -15,7 +15,7 @@ Here mainly describes how to deploy PaddlePaddle to the mobile end, as well as s
## Deployment optimization methods
- [Merge batch normalization before deploying the model to the mobile.](./tools/merge_batch_normalization/README.md)
- [Compress the model before deploying the model to the mobile.](./tools/rounding/README.md)
-- [Merge model config and parameter files into one file.](./tools/merge_config_paramsters/README.md)
+- [Merge model config and parameter files into one file.](./tools/merge_config_parameters/README.md)
- How to deploy int8 model in mobile inference with PaddlePaddle.

## Model compression
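
As context for the "merge batch normalization" optimization listed above: folding each batch-normalization layer into the preceding convolution (or fully connected) layer removes the extra layer from the exported model and saves work at inference time. The snippet below is only a generic NumPy sketch of that folding arithmetic; the function and variable names are illustrative and are not taken from the repository's ./tools/merge_batch_normalization tool.

```python
import numpy as np

def fold_batch_norm(conv_w, conv_b, gamma, beta, mean, var, eps=1e-5):
    """Fold an inference-time batch-norm layer into the preceding conv/FC layer.

    conv_w : weights with one filter per output channel, shape (C_out, ...)
    conv_b : bias of shape (C_out,), or None if the layer has no bias
    gamma, beta, mean, var : per-channel BN parameters, each of shape (C_out,)
    Returns (folded_w, folded_b) that compute conv + BN in a single layer.
    """
    if conv_b is None:
        conv_b = np.zeros_like(mean)
    scale = gamma / np.sqrt(var + eps)                              # per-channel multiplier
    folded_w = conv_w * scale.reshape((-1,) + (1,) * (conv_w.ndim - 1))
    folded_b = (conv_b - mean) * scale + beta
    return folded_w, folded_b

# Illustrative usage: a 3x3 convolution with 3 input and 8 output channels.
w = np.random.randn(8, 3, 3, 3).astype(np.float32)
b = np.random.randn(8).astype(np.float32)
gamma, beta = np.ones(8, np.float32), np.zeros(8, np.float32)
mean, var = np.random.randn(8).astype(np.float32), np.abs(np.random.randn(8)).astype(np.float32)
folded_w, folded_b = fold_batch_norm(w, b, gamma, beta, mean, var)
```

Because the batch-norm statistics (mean, var) are fixed after training, the folded layer produces the same outputs as the original conv + BN pair.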