Any plan to add SoftClipping and SmoothMax #26259
Comments
@dmilos, can you please let us know how efficient the proposed functions are?
Soft clipping is a smooth function and does not suffer from vanishing/exploding gradients.
For more in-depth details on soft clipping, see:
This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.
It appears that the answer is: no.
This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.
Please make sure that this is a feature request. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:feature_template
System information
Yes.
Describe the feature and the current behavior/state.
SoftClipping for activation. [https://en.wikipedia.org/wiki/Activation_function]
SmoothMax for pooling. [https://en.wikipedia.org/wiki/Smooth_maximum]
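A minimal numerical sketch of the two proposed functions, assuming the formulations described on the linked Wikipedia pages; the names `soft_clip` and `smooth_max` and the sharpness parameter `alpha` are illustrative, not part of any existing TensorFlow API:

```python
import numpy as np

def soft_clip(x, alpha=10.0):
    """One common soft-clipping formulation: a smooth approximation of
    clipping x to the interval (0, 1); larger alpha gives a sharper clip."""
    return (1.0 / alpha) * np.log((1.0 + np.exp(alpha * x)) /
                                  (1.0 + np.exp(alpha * (x - 1.0))))

def smooth_max(x, alpha=10.0, axis=-1):
    """Boltzmann-operator smooth maximum: a softmax-weighted average
    that approaches max(x) as alpha -> infinity."""
    # Subtract the running max before exponentiating for numerical stability.
    w = np.exp(alpha * (x - np.max(x, axis=axis, keepdims=True)))
    return np.sum(x * w, axis=axis) / np.sum(w, axis=axis)

x = np.linspace(-2.0, 3.0, 11)
print(soft_clip(x))                            # stays smoothly inside (0, 1)
print(smooth_max(np.array([1.0, 2.0, 5.0])))   # close to 5.0
```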
Will this change the current api? How?
No. This is a new feature.
Who will benefit with this feature?
Everybody.
Soft clipping is a smooth function and does not suffer from vanishing/exploding gradients,
so with it we get faster learning. See Create Soft Clipping Layer tiny-dnn/tiny-dnn#1014: with only 10 layers it achieves 99.35% on the MNIST set, compared to 99.0% with the original 12 layers.
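As a rough illustration of how the activation could be tried with the existing API (this is not the tiny-dnn implementation; the model, `ALPHA`, and `soft_clip` below are assumptions made for the sketch), soft clipping can be expressed with existing ops and a `Lambda` layer:

```python
import tensorflow as tf

ALPHA = 10.0  # illustrative sharpness parameter, not a TensorFlow constant

def soft_clip(x, alpha=ALPHA):
    # Smooth clip to (0, 1); its derivative stays bounded and non-zero over a
    # wide input range, which is the gradient property claimed above.
    # A production version would need a numerically stable formulation.
    return (1.0 / alpha) * tf.math.log(
        (1.0 + tf.exp(alpha * x)) / (1.0 + tf.exp(alpha * (x - 1.0))))

# Hypothetical wiring: use the activation via a Lambda layer until a
# dedicated layer exists.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, input_shape=(784,)),
    tf.keras.layers.Lambda(soft_clip),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

A smooth-max pooling layer could be sketched the same way with a softmax-weighted reduction over each pooling window.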
Any Other info.