
Bugfix: Drop "re-throw" in MLActivation creation steps #619

Merged


@inexorabletash (Member) commented Mar 22, 2024

The "create an MLActivation" steps don't throw, so there's no need to re-throw.

Also, nothing specifies init-steps, so drop those, and simplify how the steps handle options to align with how things are done for MLOperands.
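
For context, here is a rough JavaScript sketch of how these steps are reached from the API, assuming the MLGraphBuilder surface as specified at the time of this PR (the specific ops, names, and shapes are illustrative only): an activation-vending builder method such as the no-argument relu() overload runs the "create an MLActivation" steps, and the resulting MLActivation is only consumed as an option by other builder methods.

```js
// Illustrative sketch, not normative spec text. Assumes a WebNN-capable
// environment and the API shape at the time of this PR; run inside an
// async function or module.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

const input = builder.input('input', { dataType: 'float32', dimensions: [1, 1, 5, 5] });
const filter = builder.constant(
  { dataType: 'float32', dimensions: [1, 1, 3, 3] },
  new Float32Array(9).fill(0.1));

// The no-argument relu() overload runs the "create an MLActivation" steps
// with the name "relu".
const relu = builder.relu();

// The MLActivation is then only passed as an option, e.g. fused into a conv2d.
const output = builder.conv2d(input, filter, { activation: relu });
```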

Discussed in #591 (comment)


Preview | Diff

The "create an MLActivation" steps don't throw, so there's no
need to re-throw.

Discussed in webmachinelearning#591 (comment)
@huningxin (Contributor) left a comment


LGTM with two questions:

https://www.w3.org/TR/webnn/#create-an-mlactivation has a step to run init-steps if they are present:

If init-steps are given, run init-steps with options.

I am not sure whether it would throw. However, it looks like there are no call sites that supply the init-steps; should we remove this step accordingly?

The following step:

Otherwise, initialize activation.[[operator]] given options in an implementation-defined way for the given name operation.

I am not sure whether this step would fail. However, as #606 drops the platform operand and operator, should we drop this step as well?

@inexorabletash (Member, Author) commented:

I don't know the context around init-steps. Are there plans to add them? Or was it an idea that turned out not to be needed?

Either way, we can merge this and follow up, or I can revise; in any case, this is not blocking anything else.

@huningxin (Contributor) commented:

@inexorabletash

Are there plans to add them? Or was it an idea that turned out not to be needed?

I suppose the latter. I dug through the commits and found it was introduced by @zolkis in 30ba12e. Trying to recall, it seems it was intended to run activation initialization steps supplied by different activation-creation callers, but it turns out no callers use it. @zolkis, please correct me if I am wrong.

Either way, we can merge this and follow up, or I can revise; in any case, this is not blocking anything else.

This PR proposes to remove the "re-throw" because the "create an MLActivation" steps don't throw. I think we need to ensure that none of the "create an MLActivation" steps can throw. So, if "init-steps" is useful and may throw, the "re-throw" may need to be kept. Otherwise, the "init-steps" should be removed, which makes the "create an MLActivation" steps unable to throw at all, and then the "re-throw" can be removed. WDYT?

@inexorabletash (Member, Author) commented Mar 23, 2024

This PR proposes to remove the "re-throw" because the "create an MLActivation" steps don't throw. I think we need to ensure that none of the "create an MLActivation" steps can throw. So, if "init-steps" is useful and may throw, the "re-throw" may need to be kept. Otherwise, the "init-steps" should be removed, which makes the "create an MLActivation" steps unable to throw at all, and then the "re-throw" can be removed. WDYT?

Makes sense. I removed the steps in c6ee63a, but if we decide to keep or re-add them, we can abandon this PR and instead do one that ensures all invocations re-throw, making it explicit that init-steps themselves may throw and that methods which invoke MLActivation creation should re-throw.
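
For illustration, here is a minimal, self-contained model (not spec text; the class, function, and slot names are just stand-ins) of what the simplified steps amount to: every remaining step is plain object construction and slot initialization, so nothing can throw and call sites have nothing to re-throw.

```js
// Illustrative model only. The fields stand in for the spec's internal
// slots ([[builder]], [[name]], [[operator]]).
class ActivationModel {
  constructor(builder, name, options) {
    this.builder = builder;   // models activation.[[builder]]
    this.name = name;         // models activation.[[name]]
    // Models the implementation-defined initialization of
    // activation.[[operator]] given options for the named operation.
    this.operator = { operation: name, options: { ...options } };
  }
}

// Models "create an MLActivation" given builder, name, and options.
// Every step is infallible, so callers need no "re-throw" step.
function createActivation(builder, name, options = {}) {
  return new ActivationModel(builder, name, options);
}

// Example: roughly what a builder method like relu() would do internally.
const activation = createActivation({ /* builder */ }, 'relu');
console.log(activation.name); // "relu"
```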

@zolkis (Collaborator) commented Mar 25, 2024

Back then we wanted to keep optional init-steps around until we figured out how we would end up specifying the other builder algorithms. Since the usage has now become clearer, it's indeed time to simplify. LGTM.

@huningxin (Contributor) left a comment


LGTM!

@huningxin merged commit 2f32429 into webmachinelearning:main on Mar 25, 2024
2 checks passed
@inexorabletash deleted the bugfix-activation-vending branch on March 25, 2024 at 15:51.