
Conversation

jking-ca

Summary:
Adds a sparse L1 and L2 regularization operator to Caffe2. The operator works only with run_after_optimize, not run_on_loss: applying the regularization after the optimizer step was easier to implement, particularly for the L1 norm, which is preferable in some cases but is non-differentiable at zero.

This diff has been copied from D6735673 and modified.
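For context, a minimal numpy sketch of what sparse Lp regularization applied after the optimizer step can look like. The function name and signature below are hypothetical, not the actual Caffe2 operator interface: only the rows referenced by the sparse gradient's indices are shrunk, and the L1 case uses soft-thresholding (the proximal operator of the L1 norm), which sidesteps the non-differentiability at zero entirely.

```python
import numpy as np

def sparse_lp_regularize(param, indices, reg_lambda, p, lr=1.0):
    """Shrink only the rows touched this step (hypothetical reference sketch).

    Unlike a loss-side regularizer, this runs after the optimizer update,
    so no gradient of the penalty is ever needed.
    """
    rows = param[indices]
    if p == 1:
        # Soft-thresholding: well defined even though |w| is
        # non-differentiable at 0; zeroes out small weights exactly.
        param[indices] = np.sign(rows) * np.maximum(
            np.abs(rows) - lr * reg_lambda, 0.0)
    elif p == 2:
        # Multiplicative shrinkage toward zero (decoupled weight decay).
        param[indices] = rows * (1.0 - lr * reg_lambda)
    else:
        raise ValueError("only p in {1, 2} is supported")
    return param
```

Rows not named in `indices` are never read or written, which is what makes the operator cheap for large sparse embedding tables.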

Test Plan:
Wrote and ran unit tests:

buck test mode/dev //caffe2/caffe2/python/operator_test:sparse_lp_regularizer_test
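A self-contained numpy sketch of the kind of property such tests presumably check (the helper name here is hypothetical): rows named by the sparse indices are soft-thresholded, and every other row is left untouched.

```python
import numpy as np

def soft_threshold(rows, thresh):
    # Proximal operator of the L1 norm: shrink each magnitude by thresh,
    # clamping at zero, so weights below the threshold become exactly 0.
    return np.sign(rows) * np.maximum(np.abs(rows) - thresh, 0.0)

param = np.array([[0.8, -0.3],
                  [1.0,  1.0],
                  [0.02, -0.02]])
indices = np.array([0, 2])  # rows touched by the sparse gradient this step

out = param.copy()
out[indices] = soft_threshold(out[indices], 0.25)

# Touched rows shrink toward zero; tiny weights become exactly 0,
# which is what makes the L1 variant produce sparse parameters.
assert np.allclose(out[0], [0.55, -0.05])
assert np.allclose(out[2], [0.0, 0.0])
# Untouched rows are bit-identical.
assert np.array_equal(out[1], param[1])
```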

Differential Revision: D21003029

@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D21003029

dr-ci bot commented May 15, 2020

💊 CI failures summary and remediations

As of commit 99eb251 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚



@jking-ca force-pushed the export-D21003029 branch from 8c38a0f to 2b47eed on May 15, 2020, 23:05

@jking-ca force-pushed the export-D21003029 branch from 2b47eed to 85f10a5 on May 26, 2020, 04:51

Summary:
Pull Request resolved: pytorch#38574

Adds a sparse L1 and L2 regularization operator to Caffe2. The operator works only with run_after_optimize, not run_on_loss: applying the regularization after the optimizer step was easier to implement, particularly for the L1 norm, which is preferable in some cases but is non-differentiable at zero.

Test Plan: Wrote and ran unit tests in operator_test:sparse_lp_regularizer_test.

Differential Revision: D21003029

fbshipit-source-id: 2a82b76e8c349fbb7bd2692ecb39bd7d395cbeec
@jking-ca force-pushed the export-D21003029 branch from 85f10a5 to 99eb251 on May 26, 2020, 05:37

@facebook-github-bot

This pull request has been merged in 7f1a96d.
