
Softsign activation function #18

Closed
jeffin07 wants to merge 2 commits into ddbourgin:master from jeffin07:softsign
Conversation

@jeffin07
Contributor

Implemented the Softsign activation and a unit test for softsign.

[Figure: plot of the Softsign activation]

Files changed

  • activations.py
  • tests/tests.py
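For reference, the Softsign function proposed in this PR is x / (1 + |x|), a smooth nonlinearity bounded in (-1, 1). A minimal sketch of the forward pass and its gradient (function names here are hypothetical, not necessarily the ones used in the actual diff):

```python
import numpy as np

def softsign(x):
    # Softsign(x) = x / (1 + |x|); bounded in (-1, 1), zero-centered
    return x / (1 + np.abs(x))

def softsign_grad(x):
    # d/dx Softsign(x) = 1 / (1 + |x|)^2
    return 1 / (1 + np.abs(x)) ** 2
```

Unlike tanh, Softsign approaches its asymptotes polynomially rather than exponentially, so its gradients decay more slowly for large |x|.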

@ddbourgin
Owner

ddbourgin commented Jul 14, 2019

Thanks, @jeffin07! I'm less inclined to keep adding activation functions, since I think we've already covered the majority of nonlinearities used in modern deep learning. If there's a compelling reason to add soft-sign (e.g., a paper / architecture that reports good results using soft-sign), let me know, but otherwise I think we're probably best keeping things as they are.

@jeffin07
Contributor Author

@ddbourgin I didn't see that you had closed #7 :). I was thinking of implementing a loss function next; do you have any suggestions?

@ddbourgin
Owner

That sounds great! Perhaps a cosine or KL-divergence loss? Depending on your enthusiasm, more sophisticated things like triplet loss or connectionist temporal classification loss would be awesome, though there's a reason why I've put them off ;-)
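Of the suggestions above, KL-divergence is probably the most approachable starting point. A minimal sketch of a discrete KL-divergence loss, assuming both inputs are valid probability distributions (the function name and `eps` smoothing term are illustrative, not part of the repository's API):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i)
    # eps guards against log(0) / division by zero for sparse distributions
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```

Note that KL-divergence is asymmetric (D_KL(p || q) != D_KL(q || p) in general), which matters for how the loss is wired into training.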

@jeffin07
Contributor Author

@ddbourgin That's great! I'll close this PR now :)

@jeffin07 jeffin07 closed this Jul 16, 2019