Change padding for Deepspeech LSTM layer
Remove the global_batch_size arg from the shard_and_maybe_pad_np call. As a result, the final batch of the LibriSpeech validation and test sets is padded only enough to be split equally among the devices, so no device batch will consist entirely of padding.
Workaround for #523.
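
For reference, the effect of dropping global_batch_size can be sketched with the padding arithmetic below. This is a hypothetical illustration, not the library code: the helper padding_needed and the example sizes (8 devices, a 90-example final eval batch, a global batch size of 256) are assumptions, and it models the final batch as being padded up to the nearest multiple of the local device count when global_batch_size is not given.

import jax
import numpy as np

def padding_needed(batch_size, global_batch_size=None):
  # Hypothetical sketch: how many padding examples get appended.
  # With global_batch_size, pad all the way up to it; without it,
  # pad only up to the next multiple of the local device count.
  num_devices = jax.local_device_count()
  if global_batch_size is not None:
    target = global_batch_size
  else:
    target = int(np.ceil(batch_size / num_devices)) * num_devices
  return target - batch_size

# Example: 8 devices, a final eval batch of 90 examples, global batch size 256.
print(padding_needed(90, global_batch_size=256))  # 166 -> whole device shards of padding
print(padding_needed(90))                         # 6   -> every device shard keeps real data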
priyakasimbeg authored Oct 5, 2023
1 parent 3351f73 commit a664134
Showing 1 changed file with 1 addition and 1 deletion.
@@ -144,7 +144,7 @@ def _build_input_queue(
}

padded_batch = data_utils.shard_and_maybe_pad_np(
-        numpy_batch, padding_value=1.0, global_batch_size=global_batch_size)
+        numpy_batch, padding_value=1.0)
yield padded_batch

# Does NOT apply regularization, which is left to the submitter to do in
