
Add multi-thread inference example which shares the inference_program and parameters #9302

Merged — 12 commits merged into PaddlePaddle:develop on Apr 9, 2018

Conversation

@Xreki Xreki (Contributor) commented Mar 21, 2018

Fix #9650

@Xreki Xreki added the 预测 label (formerly named Inference; covers C-API inference issues, among others) Mar 21, 2018
@Xreki Xreki added this to Basic Usage (DOING) in Inference Framework Apr 3, 2018
@Xreki Xreki force-pushed the core_inference_multi_thread branch from a728656 to d089deb Compare April 4, 2018 06:13
@Xreki Xreki force-pushed the core_inference_multi_thread branch from 8de6563 to 208fcf5 Compare April 8, 2018 03:20
@@ -55,6 +56,9 @@ class ProgramDesc {
const std::vector<std::string> GetFeedTargetNames();
const std::vector<std::string> GetFetchTargetNames();

void SetFeedHolderName(const std::string &feed_holder_name);
Contributor
Is feed/fetch holder a new concept? It would be better to add some comments.

Contributor (Author)

Done.

}

template <typename Place>
void TestMultiThreadInference(
Contributor

Why isn't TestMultiThreadInference placed in paddle/fluid/inference/tests/test_helper.h, like TestInference is?

Contributor (Author)

Multi-threaded inference is a special use case, so it lives in a separate file to make it easier to find and reference.

TestInference<paddle::platform::CUDAPlace>(dirname, cpu_feeds, cpu_fetchs2);
LOG(INFO) << output2.dims();

CheckError<float>(output1, output2);
#endif
}

TEST(multi_thread_inference, fit_a_line) {
Contributor

TEST(inference, fit_a_line) and TEST(multi_thread_inference, fit_a_line) are very similar. Could the former become a special case of the latter with num_threads=1?

Contributor (Author)

Done.

@Xreki Xreki force-pushed the core_inference_multi_thread branch from 649e9ed to 720f619 Compare April 9, 2018 08:51
@luotao1 luotao1 (Contributor) left a comment

LGTM!

@luotao1 luotao1 merged commit ddff83f into PaddlePaddle:develop Apr 9, 2018
@Superjomn Superjomn moved this from Basic Usage (DOING) to Basic Usage (DONE) in Inference Framework Apr 11, 2018
@Xreki Xreki deleted the core_inference_multi_thread branch October 29, 2019 00:41
3 participants