Any plan to add SoftClipping and SmoothMax #26259

Open
dmilos opened this issue Mar 1, 2019 · 6 comments
Labels
comp:ops OPs related issues stale This label marks the issue/pr stale - to be closed automatically if no activity stat:contribution welcome Status - Contributions welcome type:feature Feature requests

Comments

@dmilos

dmilos commented Mar 1, 2019

Please make sure that this is a feature request. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:feature_template

System information

  • TensorFlow version (you are using):
  • Are you willing to contribute it (Yes/No): Yes.

Describe the feature and the current behavior/state.
SoftClipping for activation [https://en.wikipedia.org/wiki/Activation_function].
SmoothMax for pooling [https://en.wikipedia.org/wiki/Smooth_maximum].
(A rough sketch of both is included after this template.)

Will this change the current API? How?
No. This is a new feature.

Who will benefit from this feature?
Everybody. SoftClipping is a smooth function and does not suffer from vanishing/exploding gradients, so it should give faster learning. See Create Soft Clipping Layer tiny-dnn/tiny-dnn#1014: only 10 layers reach 99.35% on the MNIST set, compared to 99.0% with the original 12 layers.

Any other info.
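For concreteness, a minimal sketch of what the two ops could look like in TensorFlow is below. The soft clipping form and the sharpness parameter `p` follow my reading of the tiny-dnn PR linked above, and the smooth maximum is the Boltzmann-operator form from the Wikipedia article; the names `soft_clipping` and `smooth_max` are illustrative only, not an existing TensorFlow API.

```python
import tensorflow as tf


def soft_clipping(x, p=50.0):
    """Soft clipping activation: a smooth, differentiable squash into (0, 1).

    f(x) = (1/p) * ln((1 + e^(p*x)) / (1 + e^(p*(x - 1))))
    Written with softplus for numerical stability; larger p approaches
    a hard clip of x to [0, 1].
    """
    return (tf.math.softplus(p * x) - tf.math.softplus(p * (x - 1.0))) / p


def smooth_max(x, alpha=10.0, axis=-1):
    """Boltzmann-operator smooth maximum along `axis`.

    S_alpha(x) = sum_i x_i * e^(alpha*x_i) / sum_j e^(alpha*x_j)
    alpha -> +inf recovers the hard max; alpha = 0 gives the mean.
    """
    weights = tf.nn.softmax(alpha * x, axis=axis)
    return tf.reduce_sum(weights * x, axis=axis)
```

Either could be wired in without changing the existing API, e.g. `tf.keras.layers.Dense(64, activation=soft_clipping)`, or a pooling layer that applies `smooth_max` over each window in place of `tf.reduce_max`.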
@jvishnuvardhan jvishnuvardhan self-assigned this Mar 1, 2019
@jvishnuvardhan jvishnuvardhan added type:feature Feature requests comp:ops OPs related issues labels Mar 1, 2019
@jvishnuvardhan jvishnuvardhan added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Mar 1, 2019
@rmothukuru rmothukuru self-assigned this May 26, 2021
@rmothukuru
Contributor

@dmilos,
Sorry for the delayed response. The ReLU activation function seems to handle the problem of vanishing and exploding gradients well.

Can you please let us know how efficient the activation functions you proposed are compared to existing non-saturating activation functions like ReLU, ELU, and SELU? Thanks!

@rmothukuru rmothukuru added stat:awaiting response Status - Awaiting response from author and removed stat:awaiting tensorflower Status - Awaiting response from tensorflower labels May 26, 2021
@dmilos
Author

dmilos commented May 26, 2021

SoftClipping is a smooth function and does not suffer from vanishing/exploding gradients, so it should give faster learning. See tiny-dnn/tiny-dnn#1014: only 10 layers reach 99.35% on the MNIST set, compared to 99.0% with the original 12 layers.

@dmilos dmilos changed the title Any plan to add SoftClippingand SmoothMax Any plan to add SoftClipping and SmoothMax May 30, 2021
@dmilos
Author

dmilos commented May 30, 2021

For more in-depth details on Soft Clipping, see:
"Neural Network-Based Approach to Phase Space Integration"
M. D. Klimek¹·²·* and M. Perelstein¹
¹ Laboratory for Elementary Particle Physics, Cornell University, Ithaca, NY, USA
² Department of Physics, Korea University, Seoul, Republic of Korea
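If I have the paper's parameterization right (please correct me if not), the soft clipping function it discusses is

f(x) = (1/p) * ln((1 + e^(p*x)) / (1 + e^(p*(x - 1))))

with derivative f'(x) = sigmoid(p*x) - sigmoid(p*(x - 1)), which is bounded by 1 everywhere and stays close to 1 for x inside (0, 1); that is the property behind the vanishing/exploding-gradient claim above.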

@rmothukuru rmothukuru removed the stat:awaiting response Status - Awaiting response from author label May 31, 2021
@rmothukuru rmothukuru removed their assignment May 31, 2021
@rmothukuru rmothukuru added the stat:contribution welcome Status - Contributions welcome label Jun 16, 2021
@github-actions

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Mar 28, 2023
@dmilos
Author

dmilos commented Mar 28, 2023

It appears the answer is: no.
Thanks.

@google-ml-butler google-ml-butler bot removed the stale This label marks the issue/pr stale - to be closed automatically if no activity label Mar 28, 2023
@github-actions

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Sep 25, 2023