
Enable is_test attr of batch norm and drop out op for test program #8642

Merged · 6 commits into PaddlePaddle:develop · Mar 6, 2018

Conversation

kexinzhao (Contributor):

fix #8372

@kexinzhao kexinzhao requested a review from Xreki February 28, 2018 06:54
@kexinzhao kexinzhao added the 预测 label (originally named Inference; covers C-API inference issues, etc.) Feb 28, 2018
@kexinzhao kexinzhao added this to Basic Usage (DOING) in Inference Framework Mar 1, 2018
Xreki (Contributor) left a review:

Great work!

@@ -186,9 +186,7 @@ void Prune(const proto::ProgramDesc& input, proto::ProgramDesc* output) {
prune_impl(input, output, 0, -1, dependent_vars);
}

void inference_optimize_impl(const proto::ProgramDesc& input,
                             proto::ProgramDesc* output, int block_id) {
  *output = input;
Xreki (Contributor):

output -> inout

kexinzhao (Contributor, Author):

Done.

@@ -205,7 +203,12 @@ void inference_optimize_impl(const proto::ProgramDesc& input,

Xreki (Contributor), Mar 2, 2018:

Looking at it this way, the original inference_program already had this change; I simply hadn't noticed it before. My understanding is that for any op, as long as it has the is_test attribute, it should be set to true in the inference_program, so can the op-type check on lines 192-193 be removed?

kexinzhao (Contributor, Author), Mar 2, 2018:

Yes, the inference_program already goes through the inference_optimize function when it is generated.
Your point makes a lot of sense; updated~
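
To make the rule agreed on above concrete, here is a minimal, self-contained Python sketch of the idea: flip is_test to True on every op that declares the attribute, with no per-op-type check. The dict-based op layout and the helper name are hypothetical; the actual change lives in the C++ inference_optimize_impl.

# Illustration only: the real change is in C++ (inference_optimize_impl);
# the dict-based op descriptions and helper name are made up for this sketch.
def mark_ops_for_test(op_descs):
    # Set is_test=True on every op that carries the attribute, regardless of type.
    for op in op_descs:
        if "is_test" in op["attrs"]:
            op["attrs"]["is_test"] = True
    return op_descs

ops = [
    {"type": "batch_norm", "attrs": {"is_test": False, "momentum": 0.9}},
    {"type": "dropout", "attrs": {"is_test": False, "dropout_prob": 0.5}},
    {"type": "fc", "attrs": {}},  # no is_test attribute, left untouched
]
print(mark_ops_for_test(ops))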

Xreki (Contributor):

Lines 30-31 can be deleted.

kexinzhao (Contributor, Author):

Done

@@ -115,7 +115,7 @@ def train(net_type, use_cuda, save_dirname, is_local):
acc = fluid.layers.accuracy(input=predict, label=label)

# Test program
-test_program = fluid.default_main_program().clone()
+test_program = fluid.default_main_program().clone_as_test_program()
Xreki (Contributor):

It would be best to change all the places that generate a test_program, otherwise users could easily get confused.

kexinzhao (Contributor, Author):

All three places that generate a test_program have been changed~

Xreki (Contributor):

Oh, I had assumed that all the examples had a test...

p.sync_with_cpp()
p.copy_param_info_from(self)
return p

Xreki (Contributor):

Perhaps we could add a parameter to the clone function, making it clone(for_test=False) with a default value of False; that would avoid a duplicated implementation. What do you think? Also, some comments should be added to explain it.

kexinzhao (Contributor, Author):

Can't agree more. Done.
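
For context, below is a hedged usage sketch of the interface that came out of this thread, Program.clone(for_test=True). It assumes the legacy paddle.fluid API and a toy network containing batch_norm and dropout (both of which carry is_test); it is not code from this PR.

# Hedged sketch: assumes the legacy paddle.fluid API is available.
import paddle.fluid as fluid

image = fluid.layers.data(name="image", shape=[1, 28, 28], dtype="float32")
label = fluid.layers.data(name="label", shape=[1], dtype="int64")

hidden = fluid.layers.fc(input=image, size=128, act="relu")
hidden = fluid.layers.batch_norm(input=hidden)           # op carries is_test
hidden = fluid.layers.dropout(hidden, dropout_prob=0.5)  # op carries is_test
predict = fluid.layers.fc(input=hidden, size=10, act="softmax")
cost = fluid.layers.cross_entropy(input=predict, label=label)
avg_cost = fluid.layers.mean(cost)

# Clone before appending optimizer ops; for_test=True marks is_test=True on
# batch_norm and dropout in the cloned program.
test_program = fluid.default_main_program().clone(for_test=True)

fluid.optimizer.SGD(learning_rate=0.01).minimize(avg_cost)

Running test_program with an Executor then uses the accumulated mean/variance in batch_norm and skips dropout, which is exactly the behavior the is_test attribute controls.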

Xreki (Contributor) left a review:

LGTM. Thanks for your work.

@Xreki Xreki merged commit 6720681 into PaddlePaddle:develop Mar 6, 2018
@kexinzhao kexinzhao moved this from Basic Usage (DOING) to Basic Usage (DONE) in Inference Framework Mar 6, 2018
@kexinzhao kexinzhao deleted the fix_is_test branch April 4, 2018 02:14
Labels
预测 (originally named Inference; covers C-API inference issues, etc.)
Projects
Inference Framework: Basic Usage (DONE)
Development

Successfully merging this pull request may close these issues.

Need to change the attribute is_test of batch_norm op to true in test_program and inference_program
2 participants