This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

Fix loss reporting #453

Closed · wants to merge 1 commit

Conversation

@kmalik22 commented Apr 8, 2019

Summary:
When computing the average loss for train/eval/test, the per-batch losses should be weighted by batch size.

This doesn't matter much for standard training, where every batch except possibly the last is full size. However, when training in a massively distributed fashion (e.g., for Federated Learning), the data is split into many small shards, so many batches are incomplete and weighting by batch size becomes important.

Differential Revision: D14791247


fbshipit-source-id: 5d99ce1adf6e142aba8963e9fca8bbc77787df59
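
For illustration, here is a minimal sketch of the weighting the PR describes (not the PR's actual code; the helper name and the example loss values are hypothetical):

```python
def average_loss(batch_losses, batch_sizes):
    """Size-weighted mean loss: weight each batch's mean loss by its batch size."""
    total = sum(loss * n for loss, n in zip(batch_losses, batch_sizes))
    return total / sum(batch_sizes)

# Two batches: a full batch of 100 examples with mean loss 1.0, then a
# partial batch of 10 examples with mean loss 5.0.
losses, sizes = [1.0, 5.0], [100, 10]

print(sum(losses) / len(losses))    # unweighted average of batch means: 3.0
print(average_loss(losses, sizes))  # size-weighted: (100*1.0 + 10*5.0) / 110 ≈ 1.36
```

The unweighted average lets the small partial batch dominate the reported loss; weighting by batch size recovers the true per-example mean.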
@facebook-github-bot added the CLA Signed label Apr 8, 2019
@facebook-github-bot (Contributor) commented:

This pull request has been merged in fb58d20.
