
Change padding for Deepspeech LSTM layer #533

Merged 1 commit from deepspeech-padding-change into dev on Oct 7, 2023

Conversation

priyakasimbeg (Contributor)

Remove the global_batch_size arg from the call to shard_and_maybe_pad. As a result, the final batch of the LibriSpeech validation and test sets is padded just enough that it can be split evenly across the devices, so no device batch will consist entirely of padding. Workaround for #523.

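For illustration, here is a minimal sketch of the padding behavior this change targets. The PR text references a shard_and_maybe_pad helper; everything below (the signature, the num_devices parameter, and the example shapes) is a hypothetical reconstruction for illustration, not the repository's actual code:

```python
import numpy as np


def shard_and_maybe_pad(batch, num_devices, global_batch_size=None):
    """Hypothetical sketch of the padding logic this PR changes.

    Pads `batch` along axis 0, then reshapes it into per-device shards.
    With `global_batch_size` set (the old call), a short final batch is
    padded all the way up to the full global batch; without it (this PR),
    it is padded only to the next multiple of `num_devices`.
    """
    n = batch.shape[0]
    if global_batch_size is not None:
        target = global_batch_size  # old behavior: pad to full global batch
    else:
        target = -(-n // num_devices) * num_devices  # ceil to device multiple
    pad = target - n
    if pad > 0:
        padding = np.zeros((pad,) + batch.shape[1:], dtype=batch.dtype)
        batch = np.concatenate([batch, padding], axis=0)
    # [num_devices, per_device_batch, ...] layout for pmap-style sharding.
    return batch.reshape((num_devices, -1) + batch.shape[1:])


# Final eval batch of 40 examples on 8 devices, global batch size 256:
final_batch = np.ones((40, 16))
old = shard_and_maybe_pad(final_batch, 8, global_batch_size=256)
new = shard_and_maybe_pad(final_batch, 8)
print(old.shape)  # (8, 32, 16): devices 2-7 receive only padding
print(new.shape)  # (8, 5, 16): every device receives real examples
```

Under these assumptions, padding only to the next multiple of the device count confines padded examples to at most one device shard, which is the point of the workaround: no device batch is all padding.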
github-actions bot commented Oct 5, 2023

MLCommons CLA bot All contributors have signed the MLCommons CLA ✍️ ✅

priyakasimbeg marked this pull request as ready for review, October 7, 2023 03:09
priyakasimbeg requested a review from a team as a code owner, October 7, 2023 03:09
priyakasimbeg merged commit 4131232 into dev, Oct 7, 2023 (31 checks passed)
github-actions bot locked and limited conversation to collaborators, Oct 7, 2023
priyakasimbeg deleted the deepspeech-padding-change branch, November 2, 2023 22:22