Tune max_split_size_mb for pytorch memory allocator to 256 for conformer #522
Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
@pomonam found this information, so I will change this PR to tune it just for the conformer workload.
This is a temporary workaround for the Conformer OOM issue #497. It slows down the conformer workload by about 2x, so we will have to find a different long-term solution.
Also fixes a bug related to saving the metadata.
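For context, the allocator tuning described above is applied through PyTorch's `PYTORCH_CUDA_ALLOC_CONF` environment variable, which must be set before the first CUDA allocation. A minimal sketch of how a workload entry point might do this (the exact integration point in the benchmark harness is an assumption):

```python
import os

# Cap the size (in MB) above which the CUDA caching allocator will not
# split cached blocks. Smaller caps reduce fragmentation-driven OOMs on
# the conformer workload, at the cost of throughput (roughly 2x slower
# per the discussion above).
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:256"

# torch must be imported (and CUDA initialized) only after the variable
# is set, or the allocator will ignore it:
# import torch
```

Setting the variable in the launching shell (`PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:256 python train.py`) is equivalent and avoids ordering concerns entirely.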