
dev -> main #536

Merged

priyakasimbeg merged 65 commits into main from dev on Oct 9, 2023
Conversation

priyakasimbeg
Contributor

No description provided.

priyakasimbeg and others added 24 commits September 30, 2023 00:36
Remove the global_batch_size arg from the call to shard_and_maybe_pad. As a result, the final batch of the librispeech validation and test sets is padded just enough to split evenly across devices, so no device batch consists entirely of padding (a sketch of this change follows the commit list below).
Workaround for #523.
Remove test target from scoring
Adjust runtime budget for self-tuning ruleset and check that tuning search space is `None`
Change padding for Deepspeech LSTM layer
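Since the padding semantics in the first commit are easy to misread, here is a minimal sketch of the behavior change in the spirit of the repo's JAX data pipeline. The function name comes from the commit message above; the exact signature, the use of jax.local_device_count(), and the NumPy-based padding are illustrative assumptions, not the repo's actual code.

```python
import jax
import numpy as np


def shard_and_maybe_pad(batch: np.ndarray, global_batch_size=None) -> np.ndarray:
  """Pad `batch` along axis 0 and reshape to (num_devices, per_device, ...)."""
  num_devices = jax.local_device_count()
  batch_size = batch.shape[0]
  if global_batch_size is not None:
    # Old call site: pad all the way up to the global batch size. For a short
    # final eval batch, whole device shards can end up as pure padding.
    pad_to = global_batch_size
  else:
    # New call site: pad only to the next multiple of the device count. This
    # appends fewer than num_devices padding examples, so for the large
    # librispeech eval splits no device shard consists entirely of padding.
    pad_to = -(-batch_size // num_devices) * num_devices  # ceiling division
  pad_amount = pad_to - batch_size
  if pad_amount > 0:
    padding = np.zeros((pad_amount,) + batch.shape[1:], dtype=batch.dtype)
    batch = np.concatenate([batch, padding], axis=0)
  return batch.reshape((num_devices, -1) + batch.shape[1:])
```

Dropping the global_batch_size argument at the call site switches the padding from the first branch to the second.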
github-actions bot commented Oct 7, 2023

MLCommons CLA bot All contributors have signed the MLCommons CLA ✍️ ✅

priyakasimbeg changed the title from "[do not merge] dev -> main" to "dev -> main" on Oct 9, 2023
priyakasimbeg marked this pull request as ready for review on October 9, 2023 at 18:04
priyakasimbeg requested a review from a team as a code owner on October 9, 2023 at 18:04
priyakasimbeg merged commit e19dacf into main on Oct 9, 2023
github-actions bot locked and limited conversation to collaborators on Oct 9, 2023