
Conversation

@priyakasimbeg
Contributor

No description provided.

priyakasimbeg and others added 24 commits September 30, 2023 00:36
Remove the global_batch_size arg from the call to shard_and_maybe_pad. As a result, the final batch of the LibriSpeech validation and test sets is padded only enough to split evenly across the devices, so no device receives a batch containing all padding (see the first sketch after this list).
Workaround for #523.
Remove test target from scoring
Adjust the runtime budget for the self-tuning ruleset and check that the tuning search space is `None` (see the second sketch after this list)
Change padding for Deepspeech LSTM layer
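
For context, here is a minimal sketch of the padding behavior the first commit above describes. The function name, signature, and use of NumPy padding are assumptions for illustration, not the repository's exact `shard_and_maybe_pad` API: with no global batch size supplied, the final partial batch is padded only up to the next multiple of the device count, so every per-device batch contains at least one real example.

```python
import jax
import numpy as np

def shard_and_maybe_pad(batch: np.ndarray,
                        pad_to_global_batch_size: int | None = None):
  """Pads `batch` so it splits evenly across devices, then shards it.

  If `pad_to_global_batch_size` is None (the behavior after this change),
  the batch is padded only to the next multiple of the device count,
  so no device batch consists entirely of padding.
  """
  num_devices = jax.local_device_count()
  if pad_to_global_batch_size is not None:
    # Old behavior: pad all the way up to the full global batch size.
    target = pad_to_global_batch_size
  else:
    # New behavior: smallest multiple of num_devices >= len(batch).
    target = ((len(batch) + num_devices - 1) // num_devices) * num_devices
  pad_amount = target - len(batch)
  if pad_amount > 0:
    padding = np.zeros((pad_amount,) + batch.shape[1:], dtype=batch.dtype)
    batch = np.concatenate([batch, padding], axis=0)
  # Reshape to [num_devices, per_device_batch, ...] for pmap-style sharding.
  return batch.reshape((num_devices, -1) + batch.shape[1:])
```

And a hedged sketch of the self-tuning check in the fourth commit. The variable names, the validation function, and the 3x budget multiplier are assumptions here, not the submission runner's actual code: under the self-tuning ruleset, an externally supplied tuning search space should be rejected.

```python
# Hypothetical validation logic; names and the multiplier are assumptions.
SELF_TUNING_RUNTIME_MULTIPLIER = 3  # self-tuning gets a larger runtime budget

def validate_tuning_args(tuning_ruleset: str, tuning_search_space) -> int:
  """Returns a runtime-budget multiplier, rejecting invalid combinations."""
  if tuning_ruleset == 'self':
    if tuning_search_space is not None:
      raise ValueError(
          'Tuning search space must be None for the self-tuning ruleset.')
    return SELF_TUNING_RUNTIME_MULTIPLIER
  return 1
```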

github-actions bot commented Oct 7, 2023

MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅

@priyakasimbeg priyakasimbeg changed the title from "[do not merge] dev -> main" to "dev -> main" on Oct 9, 2023
@priyakasimbeg priyakasimbeg marked this pull request as ready for review on October 9, 2023 18:04
@priyakasimbeg priyakasimbeg requested a review from a team as a code owner on October 9, 2023 18:04
@priyakasimbeg priyakasimbeg merged commit e19dacf into main Oct 9, 2023
@github-actions github-actions bot locked and limited conversation to collaborators Oct 9, 2023