
Replace ErrorCode with GlowErrs #2509

Merged: 2 commits merged into pytorch:master from jackm321:no_more_ResultCode on Mar 12, 2019

Conversation

@jackm321 (Contributor) commented on Mar 9, 2019

Description:
Finishes the migration from ErrorCode to GlowErr in the Glow runtime. The last step was migrating ResultCBTy to include an llvm::Error, which required some changes to the per-run state that the Executor maintains (a rough sketch of such a callback type is included below).

Testing:
Ran ninja check.
Ran the resnet-runtime example binary.

Documentation:
Comments
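
For context, below is a minimal sketch of what a result callback carrying an llvm::Error could look like. The names RunIdentifierTy, ExecutionContext, and ResultCBTy mirror Glow's runtime types, but the definitions here are illustrative assumptions, not the exact code from this PR.

// Sketch: a result-callback type that carries an llvm::Error instead of an
// ErrorCode enum. Whoever invokes the callback hands over ownership of the
// error; whoever receives it must consume it.
#include <functional>
#include <memory>
#include <llvm/Support/Error.h>

namespace sketch {

using RunIdentifierTy = size_t; // placeholder for Glow's run identifier type
struct ExecutionContext {};     // placeholder for Glow's per-run context

using ResultCBTy = std::function<void(
    RunIdentifierTy, llvm::Error, std::unique_ptr<ExecutionContext>)>;

// Example: report a failed run through the callback.
inline void reportFailure(const ResultCBTy &cb, RunIdentifierTy runId,
                          std::unique_ptr<ExecutionContext> ctx) {
  cb(runId,
     llvm::make_error<llvm::StringError>("device run failed",
                                         llvm::inconvertibleErrorCode()),
     std::move(ctx));
}

} // namespace sketch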

@jackm321 jackm321 requested review from SplitInfinity and gcatron Mar 9, 2019

@jackm321 jackm321 force-pushed the jackm321:no_more_ResultCode branch 2 times, most recently from e37e92e to deeb589 Mar 9, 2019

@jackm321 jackm321 force-pushed the jackm321:no_more_ResultCode branch 6 times, most recently from acfe2be to e737099 Mar 9, 2019

@gcatron (Contributor) left a comment

One comment on an EXIT_ON_ERR; otherwise LGTM.

@@ -48,7 +48,7 @@ void ExecutionEngine::setBackend(Backend *backend, bool ownsBackend) {
   if (differentKinds) {
     if (device_) {
-      device_->stop();
+      EXIT_ON_ERR(device_->stop());

@gcatron (Contributor) commented on Mar 11, 2019

Would it be better to return an error from setBackend and handle it, rather than exit on it? This approach means an error here will cancel a unit test mid-run. Perhaps a separate issue, since line 59 also calls EXIT_ON_ERR?

@jackm321 (Author, Contributor) replied on Mar 11, 2019

Yeah, I'm not sure; this is somewhat philosophical. I think it's easier to just have ExecutionEngine die on any error (it's a debugging runner) than to go through all of the unit tests that use ExecutionEngine and add error handling there. If we want to go that route we can, but I'm not sure it's worth the change. What do you think?

@gcatron (Contributor) replied on Mar 11, 2019

I think long term it would be better to have the plumbing but it's tangential to this PR. Could you write up an issue for the future work?
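
Following up on the thread above, here is a minimal sketch of the plumbing gcatron describes: setBackend returns an llvm::Error for the caller (for example, a unit test) to handle, instead of calling EXIT_ON_ERR. This is a hypothetical alternative for illustration, not code from the PR; the class and member names are simplified stand-ins for Glow's.

// Hypothetical alternative: propagate the error out of setBackend so the
// caller decides how to handle it, rather than aborting the process.
#include <llvm/Support/Error.h>

namespace sketch {

struct Backend {}; // stand-in for Glow's Backend
struct DeviceManager {
  // Assumed to return llvm::Error, matching the diff above.
  llvm::Error stop() { return llvm::Error::success(); }
};

class ExecutionEngineSketch {
  DeviceManager *device_ = nullptr;

public:
  // Returns any error to the caller instead of exiting on it.
  llvm::Error setBackend(Backend *backend, bool ownsBackend) {
    (void)backend;
    (void)ownsBackend;
    if (device_) {
      if (llvm::Error err = device_->stop()) {
        return err; // e.g. a unit test can fail gracefully here
      }
    }
    // ... remaining setBackend logic would go here ...
    return llvm::Error::success();
  }
};

} // namespace sketch

// Usage in a test, so an error fails just that test instead of the whole run:
//   if (llvm::Error err = engine.setBackend(backend, /*ownsBackend=*/true)) {
//     FAIL() << llvm::toString(std::move(err));
//   }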

@gcatron (Contributor) commented on Mar 11, 2019

Approved; probably good to have @SplitInfinity look over the executor as well, though 😃

@jackm321 jackm321 force-pushed the jackm321:no_more_ResultCode branch from e737099 to 2514a87 Mar 11, 2019

@jackm321 (Author, Contributor) commented on Mar 12, 2019

Talked to @SplitInfinity offline and he said good to go.

@jackm321 jackm321 merged commit c2b3866 into pytorch:master Mar 12, 2019

6 checks passed:

ci/circleci: ASAN (passed on CircleCI)
ci/circleci: DEBUG (passed on CircleCI)
ci/circleci: RELEASE_WITH_EXPENSIVE_TESTS (passed on CircleCI)
ci/circleci: SHARED (passed on CircleCI)
ci/circleci: TSAN (passed on CircleCI)
continuous-integration/travis-ci/pr (Travis CI build passed)