
Inference example for image_classification and unit_test for "inference" #8020

Merged: 6 commits merged into PaddlePaddle:develop on Feb 6, 2018

Conversation

@sidgoyal78 (Contributor) commented on Feb 1, 2018

This addresses one of the two tasks in #7999. With the new changes from #7995, it now runs successfully.

It checks both the VGG and ResNet versions.

EDIT: removed previously pasted log
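
For context, a minimal sketch of the kind of inference check this PR adds, assuming the paddle.fluid import path (it may have been paddle.v2.fluid at the time) and a placeholder save_dirname; the actual test in the PR may differ:

import numpy
import paddle.fluid as fluid

def infer(save_dirname):
    place = fluid.CPUPlace()
    exe = fluid.Executor(place)

    # Load the program saved by fluid.io.save_inference_model, together with
    # the names of its feed variables and its fetch targets.
    [inference_program, feed_target_names,
     fetch_targets] = fluid.io.load_inference_model(save_dirname, exe)

    # A random CIFAR-10-shaped batch (NCHW); the conv input must be 4-D or 5-D.
    tensor_img = numpy.random.rand(1, 3, 32, 32).astype("float32")

    results = exe.run(inference_program,
                      feed={feed_target_names[0]: tensor_img},
                      fetch_list=fetch_targets)
    print(results[0])

The unit test would presumably run a check like this for both the VGG and ResNet models saved by the training step.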

@Xreki added the 预测 label (originally named "Inference"; covers C-API inference issues, etc.) on Feb 1, 2018
@kexinzhao added this to DOING in the Inference Framework project on Feb 1, 2018
@Xreki (Contributor) commented on Feb 5, 2018

@sidgoyal78, will you update this PR? You said #7995 would resolve the problem with this PR, and #7995 has been merged. Also, there are conflicts with the develop branch.

@sidgoyal78 (Contributor, Author) commented on Feb 5, 2018

Yes, I have updated the PR, and it now works.

Review thread on the inference code:

[inference_program, feed_target_names,
 fetch_targets] = fluid.io.load_inference_model(save_dirname, exe)

# The input's dimension of conv should be 4-D or 5-D.
tensor_img = numpy.random.rand(1, 3, 32, 32).astype("float32")
Review comment (Contributor): What is the range of input data?

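For reference on the question above (illustrative only, not code from the PR): numpy.random.rand draws samples from the uniform distribution over [0, 1), so the random test image lies in that range rather than in raw 0-255 pixel values. A quick check:

import numpy

tensor_img = numpy.random.rand(1, 3, 32, 32).astype("float32")
# Values are uniform in [0, 1): min is >= 0.0 and max is < 1.0.
print(tensor_img.min(), tensor_img.max())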

Review thread on the training loop:

if acc_value > 0.01:  # Low threshold for speeding up
    fluid.io.save_inference_model(save_dirname, ["pixel"],
                                  [predict], exe)
    early_terminate = True
Review comment (Contributor): Maybe we can change the original main to train and define another main function that runs both train and infer, so that we can use return here and do not need the boolean early_terminate.
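
A rough sketch of the suggested restructuring, with hypothetical names (train, infer, main, net_type) and the trainer setup (exe, feeder, trainer_program, avg_cost, acc, predict, and the readers) elided as in the snippet above:

def train(save_dirname):
    # ... build the network, optimizer, readers, and executor as before ...
    for pass_id in range(num_passes):
        for data in train_reader():
            loss_t, acc_t = exe.run(trainer_program,
                                    feed=feeder.feed(data),
                                    fetch_list=[avg_cost, acc])
            if acc_t > 0.01:  # low threshold for speeding up CI
                fluid.io.save_inference_model(save_dirname, ["pixel"],
                                              [predict], exe)
                return  # replaces the early_terminate boolean

def main(net_type="vgg"):
    save_dirname = "image_classification_" + net_type + ".inference.model"
    train(save_dirname)
    infer(save_dirname)

With return available inside train, the flag and the extra check after the loop are no longer needed.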

Reply (Contributor, Author): Done.

Review thread on the batch loop:

fetch_list=[avg_cost, acc])
acc_list.append(float(acc_t))
avg_loss_list.append(float(loss_t))
break  # just use 1 segment for now
Review comment (Contributor): Change the comment to: use 1 segment for speeding up CI.

Reply (Contributor, Author): Done, thanks.

@Xreki (Contributor) left a review: LGTM

@Xreki merged commit 78949c0 into PaddlePaddle:develop on Feb 6, 2018
@Xreki moved this from DOING to DONE in the Inference Framework project on Feb 7, 2018
Labels: 预测 (originally named "Inference"; covers C-API inference issues, etc.)
Projects: Inference Framework, Basic Usage (DONE)
Linked issues: none yet
Participants: 2