Added test for Cross Entropy derivative #80

Merged
3 commits merged into JuliaML:master on Mar 1, 2017

Conversation

@nishnik (Contributor) commented Mar 1, 2017

While looking at Coveralls, I found that deriv for CrossentropyLoss is not called even once.

Added proper documentation for loss functions in other.jl

If this looks fine, I will extend it to deriv2.
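
For context, a minimal standalone sketch of such a finite-difference check for the cross-entropy derivative (the helper names here are hypothetical; this is not the PR's actual test code):

# Cross-entropy loss and its analytic derivative with respect to the output.
crossentropy_value(t, o) = -t * log(o) - (1 - t) * log(1 - o)
crossentropy_deriv(t, o) = (1 - t) / (1 - o) - t / o

# Compare the analytic derivative against a central finite difference
# over a grid of valid (target, output) pairs.
for t in 0:0.1:1, o in 0.1:0.1:0.9
    h = 1e-6
    numeric = (crossentropy_value(t, o + h) - crossentropy_value(t, o - h)) / (2h)
    @assert isapprox(crossentropy_deriv(t, o), numeric; atol = 1e-4)
end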

@Evizero (Member) commented Mar 1, 2017

Hi! Yes the crossentropy one is a bit of a loner. It is a bit of a hack and doesn't quite fit in the overall design, yet it seems useful to have.

@nishnik (Contributor, Author) commented Mar 1, 2017

@Evizero is the current implementation of test_deriv fine?
If yes, I can add a patch for deriv2.
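
deriv2 here is the second derivative with respect to the output. As a hypothetical sketch (not the PR's code), the same finite-difference idea extends to it by differencing the analytic first derivative:

# Analytic first and second derivatives of the cross-entropy loss
# with respect to the output (hypothetical helper names).
crossentropy_deriv(t, o)  = (1 - t) / (1 - o) - t / o
crossentropy_deriv2(t, o) = (1 - t) / (1 - o)^2 + t / o^2

for t in 0:0.1:1, o in 0.1:0.1:0.9
    h = 1e-6
    numeric = (crossentropy_deriv(t, o + h) - crossentropy_deriv(t, o - h)) / (2h)
    @assert isapprox(crossentropy_deriv2(t, o), numeric; atol = 1e-4)
end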

@Evizero (Member) commented Mar 1, 2017

Looks good to me! Does it pass the tests?

@nishnik (Contributor, Author) commented Mar 1, 2017

It passes locally on my laptop; I will have to wait for the Travis checks.


immutable CrossentropyLoss <: SupervisedLoss end
typealias LogitProbLoss CrossentropyLoss

function value(loss::CrossentropyLoss, target::Number, output::Number)
    target >= 0 && target <= 1 || error("target must be in [0,1]")
    output >= 0 && output <= 1 || error("output must be in [0,1]")
Member commented on the diff:
I am unsure if we should do such checks. We don't usually do them because of the nature of these functions (they get called very often in some inner loop), so normally we assume the caller knows what he/she is doing.

I am ok with this approach as well though.

Member commented on the diff:
I would vote to remove the checks personally. But I don't feel strongly enough to hold up a merge

@nishnik (Contributor, Author) replied Mar 1, 2017:

@Evizero @tbreloff
Agreed with your concerns, but I think checks are necessary (especially for libraries which other libraries are going to use). I would prefer an error rather than strange behaviour in my code.

There are checks in PeriodicLoss, HuberLoss and epsilon losses.

Though I am not very experienced, in SymPy they have checks for every possible corner case. Also, Julia is awesome at showing errors:

ERROR: output must be in [0,1]
in value(::LossFunctions.CrossentropyLoss, ::Int64, ::Int64) at /home/nishnik/.julia/v0.5/LossFunctions/src/supervised/other.jl:44

Sorry, I am not experienced enough in Julia to pitch this strongly; it is just my thinking. If you want me to remove the checks I will do it, but I am a little reluctant to.
Otherwise, ready to merge from my side.

Member replied:

> There are checks in PeriodicLoss, HuberLoss and epsilon losses.

The difference there is that the check is in the constructor, while here it is in the computational kernel. The latter is executed much more often in real use.

Let's keep it for now
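
To illustrate the distinction drawn here, a hypothetical comparison of the two styles (made-up type and function names, in the Julia 0.5-era syntax the diff uses):

abstract HypotheticalLoss

# Constructor-time check (the PeriodicLoss/HuberLoss style): the validation
# cost is paid once, when the loss object is constructed.
immutable ScaledLoss <: HypotheticalLoss
    scale::Float64
    ScaledLoss(s) = s > 0 ? new(s) : error("scale must be positive")
end

# Kernel-time check (the style added in this PR): the validation cost is paid
# on every call, and the kernel typically sits inside a training loop.
function checked_value(target::Number, output::Number)
    0 <= output <= 1 || error("output must be in [0,1]")
    -target * log(output) - (1 - target) * log(1 - output)
end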

@nishnik (Contributor, Author) commented Mar 1, 2017

Soft ping @Evizero! Could you please review?

@Evizero (Member) left a review:

Looks good to me, ready to merge if you are.

@tbreloff (Member) left a review:

If @Evizero approves, so do I



@Evizero (Member) commented Mar 1, 2017

Thanks for working on this

@Evizero merged commit 086367e into JuliaML:master on Mar 1, 2017.