
Focal Loss in PyTorch

May 2020: See my updated implementation at github.com/daveboat/dy_common, which adds support for label smoothing.


This is a PyTorch implementation of focal loss (https://arxiv.org/abs/1708.02002), meant to be understandable and easily swappable with nn.CrossEntropyLoss and F.cross_entropy. This implementation includes the ignore_index parameter, like PyTorch's cross-entropy functions.
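For reference, focal loss is just cross entropy scaled by a modulating factor (1 - p_t)^gamma, where p_t is the predicted probability of the true class, so it can be built directly on top of F.cross_entropy. The snippet below is a minimal illustrative sketch of that idea, not the code in this repository; the gamma default and the reduction handling are assumptions.

```python
import torch
import torch.nn.functional as F

def focal_loss_sketch(logits, target, gamma=2.0, ignore_index=-100, reduction="mean"):
    """Illustrative focal loss built on F.cross_entropy (sketch, not this repo's code)."""
    # Per-example cross entropy, -log(p_t); ignored targets contribute 0.
    ce = F.cross_entropy(logits, target, ignore_index=ignore_index, reduction="none")
    # Recover p_t, the predicted probability of the true class.
    pt = torch.exp(-ce)
    # The modulating factor (1 - p_t)^gamma down-weights easy, well-classified examples.
    loss = (1.0 - pt) ** gamma * ce

    if reduction == "mean":
        # Average over non-ignored targets only.
        n_valid = (target != ignore_index).sum().clamp(min=1)
        return loss.sum() / n_valid
    if reduction == "sum":
        return loss.sum()
    return loss
```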

Everything is in focalloss.py and should be well-commented enough to be self-explanatory. You can either use the regular function version or the nn.Module-wrapped version.
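A rough usage sketch is shown below. The names focal_loss and FocalLoss and the gamma keyword are assumptions for illustration; check focalloss.py for the actual signatures.

```python
import torch
from focalloss import FocalLoss, focal_loss  # names assumed -- check focalloss.py

logits = torch.randn(8, 5)            # batch of 8 examples, 5 classes
target = torch.randint(0, 5, (8,))    # integer class targets

# Functional form, analogous to F.cross_entropy.
loss = focal_loss(logits, target, gamma=2.0)

# nn.Module-wrapped form, analogous to nn.CrossEntropyLoss.
criterion = FocalLoss(gamma=2.0)
loss = criterion(logits, target)
```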
