Change padding for DeepSpeech LSTM layer
Remove the global_batch_size argument from the call to shard_and_maybe_pad. As a result, the final batch of the LibriSpeech validation and test sets is padded just enough that it can be split evenly across the devices, rather than up to the full global batch size. This avoids device batches that consist entirely of padding. A sketch of the difference follows. Workaround for #523.
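
For illustration, here is a minimal sketch of the two padding behaviors. The helper pad_for_devices below is hypothetical, not the repository's actual shard_and_maybe_pad implementation; it only shows why dropping the global batch size target keeps every device batch partially real.

```python
# Hypothetical sketch of the padding change; not the repo's actual code.
from typing import Optional

import jax
import numpy as np


def pad_for_devices(batch: np.ndarray,
                    global_batch_size: Optional[int] = None) -> np.ndarray:
  """Pads the leading axis so the batch splits evenly across local devices.

  If global_batch_size is given, pad all the way up to it (old behavior:
  with a short final eval batch, some devices can receive only padding).
  If it is None, pad only to the next multiple of the device count
  (new behavior: every device batch contains at least some real examples).
  """
  n_devices = jax.local_device_count()
  n = batch.shape[0]
  if global_batch_size is not None:
    target = global_batch_size  # may be far larger than n
  else:
    target = ((n + n_devices - 1) // n_devices) * n_devices  # minimal pad
  pad = target - n
  if pad > 0:
    padding = np.zeros((pad,) + batch.shape[1:], dtype=batch.dtype)
    batch = np.concatenate([batch, padding], axis=0)
  return batch
```

For example, with 8 devices, a global batch size of 256, and a final eval batch of 10 examples, the old call pads to 256 (30+ all-padding examples per device), while omitting the target pads only to 16 (2 per device, each with at least one real example).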