This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

Merge pull request #24 from Xreki/update_links
Update the links to the documentation in the Paddle repo.
Xreki committed Nov 7, 2017
2 parents cced3df + c0698e3 commit 8833a70
Showing 1 changed file with 4 additions and 4 deletions.
README.md: 4 additions & 4 deletions
@@ -3,9 +3,9 @@
 Here mainly describes how to deploy PaddlePaddle to the mobile end, as well as some deployment optimization methods and some benchmark.

 ## How to build PaddlePaddle for mobile
-- [Build PaddlePaddle for Android](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/cross_compiling/cross_compiling_for_android_cn.md)
-- Build PaddlePaddle for IOS
-- [Build PaddlePaddle for Raspberry Pi3](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/cross_compiling/cross_compiling_for_raspberry_cn.md)
+- Build PaddlePaddle for Android [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_android_cn.md)] [[English](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_android_en.md)]
+- Build PaddlePaddle for IOS [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_ios_cn.md)]
+- Build PaddlePaddle for Raspberry Pi3 [[Chinese](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_raspberry_cn.md)] [[English](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/mobile/cross_compiling_for_raspberry_en.md)]
 - Build PaddlePaddle for PX2
 - How to build PaddlePaddle mobile inference library with minimum size.

@@ -15,7 +15,7 @@ Here mainly describes how to deploy PaddlePaddle to the mobile end, as well as s
 ## Deployment optimization methods
 - [Merge batch normalization before deploying the model to the mobile.](./tools/merge_batch_normalization/README.md)
 - [Compress the model before deploying the model to the mobile.](./tools/rounding/README.md)
-- [Merge model config and parameter files into one file.](./tools/merge_config_paramsters/README.md)
+- [Merge model config and parameter files into one file.](./tools/merge_config_parameters/README.md)
 - How to deploy int8 model in mobile inference with PaddlePaddle.

 ## Model compression
