Enable exporting for inference when loading from buffer without behavior changes #21601

Merged
carzh merged 4 commits into main from carzh/export_for_inference_buffer on Aug 9, 2024

Conversation

@carzh (Contributor) commented Aug 2, 2024

Description

Added the eval model buffer as an optional field in Module so that you can export for inference using the eval model stored as a buffer.
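For context, here is a minimal sketch of the flow this change enables, written against the Python on-device training API (onnxruntime.training.api): constructing a Module from in-memory model buffers and then exporting an inference-only model. The file names, checkpoint path, and graph output name are placeholders, and the exact constructor/argument names may differ between releases.

```python
# Minimal sketch (not the PR's code): build a Module from in-memory buffers,
# then export an inference-only model. Paths and output names are placeholders.
from onnxruntime.training.api import CheckpointState, Module

# Read the generated training artifacts into memory instead of passing paths.
with open("training_model.onnx", "rb") as f:
    train_model_bytes = f.read()
with open("eval_model.onnx", "rb") as f:
    eval_model_bytes = f.read()

state = CheckpointState.load_checkpoint("checkpoint")

# With the eval model buffer retained on the Module, buffer-based construction
# keeps the information needed for export-for-inference.
module = Module(train_model_bytes, state, eval_model_bytes, device="cpu")

# ... run training steps as usual ...

# Export an inference graph derived from the eval model held in memory.
module.export_model_for_inferencing("inference_model.onnx", ["output"])
```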

Motivation and Context

Resolves the linked issue below: [Training] Cannot export model for inferencing from session created from buffers.

@skottmckay (Contributor) left a comment

Nice. Much cleaner.

@carzh merged commit eeef0c8 into main on Aug 9, 2024
95 of 98 checks passed
@carzh deleted the carzh/export_for_inference_buffer branch on August 9, 2024 at 23:59
Development

Successfully merging this pull request may close these issues.

[Training] Cannot export model for inferencing from session created from buffers
3 participants