Remove logsoftmax #513
Conversation
Thx @Sara04, great job! I like the idea of the out_fun convention and the LogSoftmax/Identity approach. Some details I want to discuss:

- I have doubts about the property name add_log_softmax: it may be misleading, suggesting that an action of adding log_softmax is being performed. I have commented with some propositions (@bruAristimunha, @PierreGtch, please say what you think as well).
- If I understand this PR correctly, we need to add an add_log_softmax parameter to all the models and then pass it to super().__init__(). In most of the models it is always set to True in super().__init__(); I left comments for some of those cases here. (A minimal sketch of the pattern follows after this comment.)

Edit: I didn't notice that this is a draft, so maybe I reviewed too early 😂
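For readers skimming the thread, here is a minimal sketch of the out_fun / LogSoftmax-Identity pattern under discussion. The names ExampleModelBase and ExampleModel are hypothetical stand-ins, not braindecode's actual classes, and the warning text is illustrative:

```python
import warnings

from torch import nn


class ExampleModelBase(nn.Module):
    """Toy stand-in for the shared parent class discussed in this PR."""

    def __init__(self, add_log_softmax=True):
        super().__init__()
        if add_log_softmax:
            warnings.warn(
                "LogSoftmax as a final layer is deprecated and will be "
                "removed in the future; set add_log_softmax=False.",
                DeprecationWarning,
            )
        # out_fun is either LogSoftmax or a no-op, so forward() code
        # stays identical in both cases.
        self.out_fun = nn.LogSoftmax(dim=1) if add_log_softmax else nn.Identity()


class ExampleModel(ExampleModelBase):
    def __init__(self, n_channels, n_classes, add_log_softmax=True):
        super().__init__(add_log_softmax=add_log_softmax)
        self.linear = nn.Linear(n_channels, n_classes)

    def forward(self, x):
        return self.out_fun(self.linear(x))
```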
Thank you very much for your comments, @sliwy! 😄
Hi @sliwy, thanks for your review! Sara and I discussed your comments, and I will answer them individually.
@Sara04 @PierreGtch based on this #513 (comment), I think I may not fully understand what the logsoftmax flow is going to look like. For now, all models (or almost all) are created with a softmax at the end. In the new flow, if a user wants a classification model, would they need to add the softmax on their own?
Codecov Report

@@            Coverage Diff             @@
##           master     #513      +/-   ##
==========================================
- Coverage   84.37%   84.32%   -0.05%
==========================================
  Files          63       63
  Lines        4545     4569      +24
==========================================
+ Hits         3835     3853      +18
- Misses        710      716       +6
Yes, exactly: the user will have to add the softmax themselves, as this is standard practice (a short usage sketch follows below). But in practice you will not need to, because if you use the …
Ok, cool, sounds awesome!
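To illustrate what this flow means for users, here is a hypothetical usage sketch (not code from the PR): with the final LogSoftmax gone, the model emits raw logits, nn.CrossEntropyLoss consumes them directly, and probabilities are computed explicitly only when needed.

```python
import torch
from torch import nn

# Stand-in for any model built with add_log_softmax=False: it ends in a
# plain linear layer and therefore outputs raw logits.
model = nn.Linear(22, 4)

x = torch.randn(8, 22)                 # dummy batch: 8 samples, 22 features
logits = model(x)                      # raw scores, no LogSoftmax applied

targets = torch.randint(0, 4, (8,))
loss = nn.CrossEntropyLoss()(logits, targets)  # applies log-softmax internally

probs = torch.softmax(logits, dim=1)   # only if probabilities are needed
```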
Can we merge this, @Sara04, @PierreGtch, and @sliwy? Looks okay to me.
Just two small changes: in cases where add_log_softmax was already there before and defaulted to False, we should keep the same default (a short sketch of this follows below). Otherwise all good to merge! Thanks a lot @Sara04 for all this work!!!
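To make the requested default handling concrete, a small continuation of the ExampleModelBase sketch above (names are still illustrative, not braindecode's actual classes): a model that did not append LogSoftmax before this PR keeps False as its default, so existing behaviour is preserved.

```python
from torch import nn  # ExampleModelBase as defined in the sketch above


class ModelWithoutSoftmaxBefore(ExampleModelBase):
    # Keep the historical default: this model never appended LogSoftmax,
    # so add_log_softmax defaults to False rather than True.
    def __init__(self, n_channels=22, n_classes=4, add_log_softmax=False):
        super().__init__(add_log_softmax=add_log_softmax)
        self.linear = nn.Linear(n_channels, n_classes)

    def forward(self, x):
        return self.out_fun(self.linear(x))
```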
Changes were made.
Thank you @Sara04, @PierreGtch, @sliwy!
* Setting up logsoft max as an optional output function with a warning that it will be remove in the future * Fix warning * Remove add_log_softmax from mother class init and put it in model's init; fix hybrid and tidnet * Move deprecation check before mother class init * Remove deprecated argument from usleep * Remove unused arguments, remove unnecessary part from docstring, update examples * Removing conversion of classifier to regressor in examples * Removing comment * Flake8 * Update braindecode/models/usleep.py Co-authored-by: PierreGtch <25532709+PierreGtch@users.noreply.github.com> * Update braindecode/models/tcn.py Co-authored-by: PierreGtch <25532709+PierreGtch@users.noreply.github.com> * Fixing the warning --------- Co-authored-by: PierreGtch <25532709+PierreGtch@users.noreply.github.com> Co-authored-by: bruAristimunha <a.bruno@aluno.ufabc.edu.br>
Thanks @PierreGtch, @sliwy, @bruAristimunha!
Addresses issue #511 by setting up the add_log_softmax flag with a warning that LogSoftmax as a final layer will be removed in the future. Proposed by @PierreGtch. Cf. #457.