Conversation

@priyakasimbeg
Contributor

Remove the global_batch_size arg from the call to shard_and_maybe_pad. As a result, the final batch of the LibriSpeech validation and test sets is padded only enough to be split evenly across the devices, so no device batch consists entirely of padding.
Workaround for #523.
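
Below is a minimal sketch of the padding behavior being changed, not the repository's actual implementation. It assumes a helper roughly shaped like data_utils.shard_and_maybe_pad_np(batch, padding_value, global_batch_size) in a JAX setup; the names and signature here are illustrative.

```python
# Hypothetical sketch of padding a batch before sharding across devices.
# Passing global_batch_size pads the last batch all the way up to the full
# global size; omitting it pads only to the next multiple of the device count.
import jax
import numpy as np


def shard_and_maybe_pad_np(batch, padding_value=0.0, global_batch_size=None):
  """Pads the leading dim and reshapes arrays to (num_devices, per_device, ...)."""
  num_devices = jax.local_device_count()
  current_size = next(iter(batch.values())).shape[0]
  if global_batch_size is not None:
    # Old behavior: pad up to the full global batch size, which can leave some
    # devices with batches made entirely of padding.
    target_size = global_batch_size
  else:
    # New behavior: pad just enough to split evenly across devices, so every
    # device batch contains at least some real examples.
    target_size = int(np.ceil(current_size / num_devices)) * num_devices

  def _pad_and_shard(x):
    pad_amount = target_size - x.shape[0]
    if pad_amount > 0:
      pad_width = [(0, pad_amount)] + [(0, 0)] * (x.ndim - 1)
      x = np.pad(x, pad_width, constant_values=padding_value)
    return x.reshape((num_devices, -1) + x.shape[1:])

  return {k: _pad_and_shard(v) for k, v in batch.items()}
```

For example, with 4 local devices and a final validation batch of 10 examples, dropping global_batch_size pads to 12 rows instead of the full global batch size, so each device sees 3 rows and only 2 padded rows exist in total.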
@github-actions

github-actions bot commented Oct 5, 2023

MLCommons CLA bot All contributors have signed the MLCommons CLA ✍️ ✅

@priyakasimbeg priyakasimbeg marked this pull request as ready for review October 7, 2023 03:09
@priyakasimbeg priyakasimbeg requested a review from a team as a code owner October 7, 2023 03:09
@priyakasimbeg priyakasimbeg merged commit 4131232 into dev Oct 7, 2023
@github-actions github-actions bot locked and limited conversation to collaborators Oct 7, 2023
@priyakasimbeg priyakasimbeg deleted the deepspeech-padding-change branch November 2, 2023 22:22