
Softsign activation function #18

Closed
wants to merge 2 commits into from

Conversation

jeffin07
Contributor

Implemented the Softsign activation and a unit test for Softsign.
[plot of the Softsign activation function]

files changed

  • activations.py
  • test.py
    -../tests/tests.py
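For reference, Softsign is defined as f(x) = x / (1 + |x|), with derivative f'(x) = 1 / (1 + |x|)^2. A minimal sketch of the function and its gradient (the names here are illustrative, not necessarily what the PR's `activations.py` uses):

```python
def softsign(x):
    # Softsign activation: f(x) = x / (1 + |x|)
    # Squashes inputs smoothly into (-1, 1), like tanh but with
    # polynomial rather than exponential tails.
    return x / (1.0 + abs(x))


def d_softsign(x):
    # Derivative of Softsign: f'(x) = 1 / (1 + |x|)^2
    return 1.0 / (1.0 + abs(x)) ** 2
```

A vectorized implementation would typically use `np.abs` on NumPy arrays instead, but the scalar form above shows the math directly.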

@ddbourgin
Owner

ddbourgin commented Jul 14, 2019

Thanks, @jeffin07! I'm less inclined to keep adding activation functions, since I think we've already covered the majority of nonlinearities used in modern deep learning. If there's a compelling reason to add soft-sign (e.g., a paper / architecture that reports good results using soft-sign), let me know, but otherwise I think we're probably best keeping things as they are.

@jeffin07
Contributor Author

@ddbourgin I didn't see that you closed #7 :). I was thinking of doing a loss function. Do you have any suggestions?

@ddbourgin
Owner

That sounds great! Perhaps a cosine or KL-divergence loss? Depending on your enthusiasm, more sophisticated things like triplet loss or connectionist temporal classification loss would be awesome, though there's a reason why I've put them off ;-)
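Of the losses mentioned, KL divergence is the most self-contained: KL(p || q) = Σᵢ pᵢ · log(pᵢ / qᵢ) for discrete distributions p and q. A minimal sketch (the function name and the `eps` smoothing term are assumptions, not from this repo):

```python
import math


def kl_divergence(p, q, eps=1e-12):
    # Discrete KL divergence: KL(p || q) = sum_i p_i * log(p_i / q_i)
    # `eps` guards against log(0) / division by zero when a bin is empty;
    # p and q are assumed to be same-length probability vectors.
    return sum(
        pi * math.log((pi + eps) / (qi + eps))
        for pi, qi in zip(p, q)
    )
```

KL divergence is zero when the distributions match and strictly positive otherwise, which the unit tests for such a loss would typically check.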

@jeffin07
Contributor Author

@ddbourgin That's great, I will close this PR now :)

@jeffin07 jeffin07 closed this Jul 16, 2019