Register optimizers in a centralized location #7157

Merged: mattsoulanille merged 1 commit into tensorflow:master from mattsoulanille:optimizer_registration_central on Dec 12, 2022
Conversation
mattsoulanille force-pushed the optimizer_registration_central branch from f7ed2ce to a1dcb23 on December 9, 2022 23:05
pyu10055 approved these changes on Dec 9, 2022:
Reviewed 11 of 11 files at r1, all commit messages.
Reviewable status: complete! 1 of 1 approvals obtained (waiting on @Linchenn)
Linchenn approved these changes on Dec 9, 2022
mattsoulanille force-pushed the optimizer_registration_central branch from a1dcb23 to b68b939 on December 10, 2022 01:22
Optimizers are currently registered in the same file they are defined in, which is a side effect. This would make it impossible to tree-shake them in a custom bundle when `sideEffects` is removed from the package.json. This PR moves Optimizer registration to the `register_optimizers.ts` file. The files that the Optimizers are defined in no longer have side effects. To exclude Optimizers from a custom bundle, the `registerOptimizers` function is called from `index.ts`. Custom bundles replace `index.ts` with a different file that does not call this function.
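A minimal sketch of the registration pattern described above, assuming a simple class-name registry (the `registry`, `registerClass`, and optimizer names here are illustrative, not the actual tfjs internals):

```typescript
// Illustrative registry keyed by class name (not the real tfjs API).
type SerializableCtor = (new () => object) & { className: string };

const registry = new Map<string, SerializableCtor>();

function registerClass(ctor: SerializableCtor): void {
  registry.set(ctor.className, ctor);
}

// Optimizer modules only *define* classes; they no longer register
// themselves, so importing these files has no side effects.
class SGDOptimizer {
  static readonly className = 'SGD';
}
class AdamOptimizer {
  static readonly className = 'Adam';
}

// Central registration, called once from index.ts. A custom bundle that
// replaces index.ts simply never calls this, letting bundlers tree-shake
// any optimizer the bundle does not use.
export function registerOptimizers(): void {
  registerClass(SGDOptimizer);
  registerClass(AdamOptimizer);
}
```

The key property is that registration becomes an explicit call at the entry point rather than an import-time side effect buried in each optimizer file.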
mattsoulanille force-pushed the optimizer_registration_central branch from b68b939 to d69ec8d on December 12, 2022 18:03
AdamLang96 added a commit to CodeSmithDSMLProjects/tfjs that referenced this pull request on Dec 13, 2022:
* started resize bicubic
* started padding algorithm for bicubic forward pass in cpu backend
* started padding algorithm for bicubic forward pass in cpu backend
* Mark all calls to 'op()' as pure (tensorflow#7155)

  Mark calls to the `op()` function that creates the exported op as pure by using [`/* @__PURE__ */` annotations](https://esbuild.github.io/api/#ignore-annotations) (this also works for Rollup, but I can't find the docs). This comment instructs bundlers that the function call has no side effects, so it can be removed if the result is not used. This is okay for the `op` function because, although it references ENGINE, it does so [in a closure](https://github.com/tensorflow/tfjs/blob/master/tfjs-core/src/ops/operation.ts#L48-L61) that it never calls, so while its return value may cause side effects when called, it itself does not. This has no immediate effect because we still maintain a list of `sideEffects` in the package.json, but it is a step towards removing that list.

  Co-authored-by: Linchenn <40653845+Linchenn@users.noreply.github.com>
* need to fix padding algo
* Update rules_python to 0.16.1 (tensorflow#7160)

  This update includes new lock files for pypi packages that make sure their versions don't change between builds. These lock files can be generated with the update_locked_deps.sh script. As part of this update, the PR pins flax to 0.6.2. Additionally, python dependencies will only be fetched when a build requires them, so first-time javascript-only builds should see a speedup.
* Register optimizers in a centralized location (tensorflow#7157)

  Optimizers are currently registered in the same file they are defined in, which is a side effect. This would make it impossible to tree-shake them in a custom bundle when `sideEffects` is removed from the package.json. This PR moves Optimizer registration to the `register_optimizers.ts` file. The files that the Optimizers are defined in no longer have side effects. To exclude Optimizers from a custom bundle, the `registerOptimizers` function is called from `index.ts`. Custom bundles replace `index.ts` with a different file that does not call this function.
* Simplify how Optimizers are re-exported in train.ts (tensorflow#7156)

  `train.ts` exports optimizers by copying them from the `OptimizerConstructors` class onto a `train` object. This is unnecessary because the `OptimizerConstructors` class constructor is a subtype of the `train` object's type (i.e. it has all the properties that `train` has). Instead of creating a new `train` object, this PR re-exports `OptimizerConstructors` as `train`. This has no direct effect now, but if / when we remove the `sideEffects` field from `package.json`, it helps some bundlers (esbuild) do tree-shaking.
* Use static getters to get optimizer class names (tensorflow#7168)

  Each `Optimizer` lists its class name as a static property of the class so it can be serialized and deserialized. This prevents the class from being tree-shaken because bundlers will compile it like this:

  ```
  class SomeOptimizer { ... }
  // The bundler can not remove this assignment because
  // SomeOptimizer.className could be a setter with a side effect.
  SomeOptimizer.className = 'SomeOptimizer';
  ```

  This PR uses a static getter for the class name instead, which bundlers can tree-shake properly.
* need corners
* padding is functional
* debugging padding tool for multiple channels

Co-authored-by: Matthew Soulanille <msoulanille@google.com>
Co-authored-by: Linchenn <40653845+Linchenn@users.noreply.github.com>
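The static-getter fix from tensorflow#7168 can be sketched like this (the class name is illustrative):

```typescript
// With a static getter, the class name lives inside the class body, so no
// separate `SomeOptimizer.className = ...` assignment is emitted after the
// class definition, and the whole class can be tree-shaken when unused.
class SomeOptimizer {
  static get className(): string {
    return 'SomeOptimizer';
  }
}
```

From the caller's perspective nothing changes: `SomeOptimizer.className` still evaluates to the string, but the bundler no longer sees a potentially effectful property assignment.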
Linchenn pushed a commit to Linchenn/tfjs that referenced this pull request on Jan 9, 2023:
Optimizers are currently registered in the same file they are defined in, which is a side effect. This would make it impossible to tree-shake them in a custom bundle when `sideEffects` is removed from the package.json. This PR moves Optimizer registration to the `register_optimizers.ts` file. The files that the Optimizers are defined in no longer have side effects. To exclude Optimizers from a custom bundle, the `registerOptimizers` function is called from `index.ts`. Custom bundles replace `index.ts` with a different file that does not call this function.
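One of the commits referenced above (tensorflow#7155) relies on pure-call annotations; a minimal sketch of that technique, with illustrative names (`makeOp`/`addOp` are stand-ins, not the tfjs API):

```typescript
// makeOp is an illustrative stand-in for a factory whose call we want
// bundlers to treat as side-effect free.
function makeOp<T>(fn: T): T {
  return fn;
}

// The /* @__PURE__ */ annotation tells esbuild/Rollup that this call has
// no side effects, so `addOp` can be dropped entirely if nothing imports it.
export const addOp = /* @__PURE__ */ makeOp((a: number, b: number) => a + b);
```

The annotation is a promise from the author, not something the bundler verifies, which is why the PR's argument that `op()` only builds an uncalled closure matters.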
To see the logs from the Cloud Build CI, please join either our discussion or announcement mailing list.