Loss function in BackpropTrainer #231

Open
mfornet opened this issue Dec 6, 2017 · 0 comments
mfornet commented Dec 6, 2017

Is there any way to change the loss function when applying backpropagation to train the network? I want to use cross entropy (CE) instead of mean squared error (MSE), which is the default.

I have read this issue here, but I see no explicit answer there, and in my opinion it is very important to have at least the most common loss functions implemented. There is a way to do it, but it involves changing the source code of pybrain.
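For reference, this is roughly what I mean. A minimal, untested sketch that subclasses `BackpropTrainer` and overrides its internal `_calcDerivs` method (the method name and the `outputbuffer` / `backActivate` calls are taken from the pybrain source, so this relies on internals and may break with other versions; the class name `CrossEntropyBackpropTrainer` is just one I made up). It assumes a softmax output layer, in which case the gradient of cross entropy with respect to the pre-softmax activations is `output - target`, so only the reported error changes while the back-propagated `target - output` signal stays the same:

```python
import numpy as np
from pybrain.supervised.trainers import BackpropTrainer


class CrossEntropyBackpropTrainer(BackpropTrainer):
    """Sketch: report cross entropy instead of 0.5 * sum((t - y)**2).

    Assumes a softmax output layer; importance weighting from the
    original _calcDerivs is omitted for brevity.
    """

    def _calcDerivs(self, seq):
        self.module.reset()
        for sample in seq:
            self.module.activate(sample[0])
        error = 0
        ponderation = 0.
        for offset, sample in reversed(list(enumerate(seq))):
            target = sample[1]
            output = self.module.outputbuffer[offset]
            # cross-entropy error; clip the output to avoid log(0)
            error -= np.sum(target * np.log(np.clip(output, 1e-12, 1.0)))
            ponderation += len(target)
            # same error signal as the stock trainer back-propagates
            self.module.backActivate(target - output)
        return error, ponderation
```

Training would then look the same as with the stock trainer, e.g. `trainer = CrossEntropyBackpropTrainer(net, dataset)` followed by `trainer.train()`. Having something like this built in, with the usual loss functions selectable, would be much nicer than patching or shadowing internals.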
