This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Add options for focal loss #3036

Merged
merged 30 commits into from
Jul 10, 2019

Conversation

@guoquan (Contributor) commented Jul 6, 2019

Add focal loss to deal with class imbalance
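For readers unfamiliar with the technique: focal loss (Lin et al., 2017) down-weights well-classified examples so training focuses on the hard ones, which helps under class imbalance. A minimal NumPy sketch of the per-example formula, assuming binary targets; the `focal_loss` helper here is illustrative, not the PR's actual implementation:

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=0.25):
    """Per-example focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    probs:   predicted probability of the positive class, shape (N,)
    targets: binary gold labels in {0, 1}, shape (N,)
    gamma:   focusing parameter; gamma == 0 recovers alpha-weighted cross entropy
    alpha:   class-balance weight for the positive class
    """
    p_t = np.where(targets == 1, probs, 1.0 - probs)      # prob of the gold class
    alpha_t = np.where(targets == 1, alpha, 1.0 - alpha)  # per-class weight
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

With `gamma > 0`, a confidently correct prediction (`p_t` near 1) contributes almost nothing to the loss, while a misclassified one keeps most of its cross-entropy weight.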
@guoquan (Contributor, Author) commented Jul 6, 2019

Sorry, it seems there are some problems, but it works fine in my local copy.
Can anyone send me the error message from CI?

@matt-gardner (Contributor)

@guoquan (Contributor, Author) commented Jul 7, 2019

Thanks, I did not see the guest option. I will get back when I figure this out.

@guoquan (Contributor, Author) commented Jul 8, 2019

Finally here. It was not an easy job to make these checkers happy.

@schmmd schmmd requested a review from matt-gardner July 9, 2019 15:35
@matt-gardner (Contributor) left a comment

Largely looks great! There are just a few minor things that could use some fixing, then this is good to merge. Thanks for the contribution!

Review threads on allennlp/nn/util.py and allennlp/tests/nn/util_test.py (outdated; resolved).
For a clearer cross_entropy formulation
Combine everything into `weights` and avoid reference to local variables later.
Add `@flaky` also to token-average tests.
@guoquan (Contributor, Author) commented Jul 10, 2019

It seems that moving where the weights are combined brings in a new problem. I will look at it later.

Avoid involving `gamma` or `alpha` in average.
More tolerance for the token average so the chance it complains is < 1/1000.
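The two commit notes above describe a normalization detail worth spelling out: the focal factor `(1 - p_t)^gamma` should scale each token's loss, but the averaging denominator should remain the sum of the mask weights, so changing `gamma` (or `alpha`) does not silently rescale the reported loss. A hedged NumPy sketch; the `sequence_focal_xent` helper, shapes, and names are assumptions for illustration, not AllenNLP's actual code:

```python
import numpy as np

def sequence_focal_xent(gold_probs, mask, gamma=2.0):
    """Masked, token-averaged focal cross entropy over one sequence.

    gold_probs: probability assigned to the gold label at each token, shape (T,)
    mask:       1.0 for real tokens, 0.0 for padding, shape (T,)
    """
    nll = -np.log(gold_probs)            # per-token cross entropy
    focal = (1.0 - gold_probs) ** gamma  # focal modulation scales the loss...
    per_token = focal * nll * mask
    # ...but the average is taken over the mask weights only, so gamma does
    # not enter the denominator (cf. "avoid involving gamma or alpha in average").
    return per_token.sum() / mask.sum()
```

With `gamma = 0` this reduces to the plain masked mean of the per-token cross entropy; raising `gamma` shrinks the numerator without touching the denominator.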
@matt-gardner (Contributor) left a comment

Looks great, thanks @guoquan!

@matt-gardner matt-gardner merged commit ebe9113 into allenai:master Jul 10, 2019
@guoquan guoquan deleted the patch-2 branch July 10, 2019 22:51
reiyw pushed a commit to reiyw/allennlp that referenced this pull request Nov 12, 2019
* Add options for focal loss

Add focal loss to deal with class imbalance

* Fix some typing problems

* Update util.py

* Update util.py

* Update util.py

* Fix too long lines

* Fix keyword argument alert

* Update util.py

* Update util.py

* Fix a problem where focal loss was not activated

* Add test for focal loss gamma

* Fix some decimal precision problems

* Add focal loss alpha test

* Update util_test.py

* Update util.py

* Update util_test.py

* Address some pylint and mypy problems

* Update util.py

* Update util.py

* Update util.py

* Update util.py

* Update util_test.py

* Update util_test.py

It was in the wrong place. Sorry.

* restore not-callable after torch.tensor()

* Update util_test.py

For a clearer cross_entropy formulation

* Update util.py

Combine everything into `weights` and avoid reference to local variables later.

* Update util_test.py

Add `@flaky` also to token-average tests.

* Update util.py

Avoid involving `gamma` or `alpha` in average.

* Update util_test.py

* Update util_test.py

More tolerance for the token average so the chance it complains is < 1/1000.