Is there any way to change the loss function used during backpropagation when training a network? I want to use cross entropy (CE) instead of the default mean squared error (MSE).

I have read this issue here, but I see no explicit answer, and in my opinion it is important to have at least the most common loss functions implemented. There is a way to do it, but it involves changing PyBrain's source code.
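For reference, here is a minimal NumPy sketch of what swapping MSE for cross entropy changes at the output layer. It does not use PyBrain's API; it only illustrates the standard result that, for a softmax output layer trained with cross entropy, the error delta with respect to the pre-activations simplifies to `p - y` (whereas MSE would instead propagate `(p - y)` scaled by the activation derivative). The function names below are illustrative, not part of any library.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(p, y):
    # CE loss for a one-hot target y and predicted distribution p.
    return -np.sum(y * np.log(p))

def mse(p, y):
    # The default MSE loss, for comparison.
    return 0.5 * np.sum((p - y) ** 2)

z = np.array([2.0, 1.0, 0.1])   # pre-activations at the output layer
y = np.array([1.0, 0.0, 0.0])   # one-hot target
p = softmax(z)

# With softmax + cross entropy, the gradient of the loss w.r.t. z is
# simply (p - y): this is the output delta a backprop trainer would
# propagate in place of the MSE delta.
grad_ce = p - y
```

Because `p` and `y` each sum to 1, the components of `grad_ce` sum to zero, which is a quick sanity check on the derivation.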