Softsign activation function #18
jeffin07 wants to merge 2 commits into ddbourgin:master from jeffin07:softsign
Conversation
Thanks, @jeffin07! I'm less inclined to keep adding activation functions, since I think we've already covered the majority of nonlinearities used in modern deep learning. If there's a compelling reason to add soft-sign (e.g., a paper / architecture that reports good results using soft-sign), let me know, but otherwise I think we're probably best keeping things as they are. |
@ddbourgin I didn't see that you closed #7 :) . I was thinking of doing a loss function. Do you have any suggestions?
That sounds great! Perhaps a cosine or KL-divergence loss? Depending on your enthusiasm, more sophisticated things like triplet loss or connectionist temporal classification loss would be awesome, though there's a reason why I've put them off ;-) |
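For reference, a minimal NumPy sketch of the KL-divergence loss mentioned above (the function name and `eps` clipping are my own choices, not code from this repo):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    `eps` clips probabilities away from zero to avoid log(0) / division by
    zero; this is a common convenience, not part of the mathematical
    definition.
    """
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q))
```

KL divergence is zero when the two distributions match and strictly positive otherwise, which makes it a natural loss for matching predicted probabilities to targets.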
@ddbourgin that's great, will close this PR now :)
Implemented Softsign activation and unit test for softsign
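For context, the softsign activation is `x / (1 + |x|)`, a smooth, bounded nonlinearity similar in shape to tanh but with polynomial rather than exponential tails. A minimal NumPy sketch (not the PR's actual implementation):

```python
import numpy as np

def softsign(x):
    """Softsign activation: f(x) = x / (1 + |x|), bounded in (-1, 1)."""
    return x / (1 + np.abs(x))

def softsign_grad(x):
    """Derivative of softsign: f'(x) = 1 / (1 + |x|)^2."""
    return 1 / (1 + np.abs(x)) ** 2
```

Unlike tanh, softsign saturates slowly (its gradient decays quadratically rather than exponentially), which is the usual argument made in its favor.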

Files changed
-../tests/tests.py