
Improving inference time #14

Open
joshwa71 opened this issue Aug 20, 2022 · 1 comment


@joshwa71

We are looking to run this model with lower processing time. Is there any way to tune hyperparameters and/or parallelise inference so that we can leverage more compute?

@Xzzit

Xzzit commented Jun 27, 2023

In my case, adding the `--no_flip` option to the command line cut about 5 seconds off the inference phase. There is a minor reduction in image quality, but the time savings make it a worthwhile trade-off.
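For anyone wondering what the flag does: in pix2pix-style repos, `--no_flip` is typically an `argparse` store-true option that drops the random horizontal-flip step from the data pipeline. A minimal sketch of that wiring (the parser and transform names here are illustrative assumptions, not this repo's actual code):

```python
import argparse

# Hedged sketch: how a pix2pix-style repo commonly wires --no_flip.
# Only the flag name comes from the comment above; everything else
# is an assumption for illustration.
parser = argparse.ArgumentParser()
parser.add_argument("--no_flip", action="store_true",
                    help="skip random horizontal flipping of inputs")

def build_transforms(opt):
    # Base preprocessing always runs.
    transforms = ["resize", "to_tensor"]
    if not opt.no_flip:
        # Random flip is an augmentation; dropping it removes one
        # per-image pass over the data, which is where the saving
        # at inference time would come from.
        transforms.insert(1, "random_horizontal_flip")
    return transforms

opt = parser.parse_args(["--no_flip"])
print(build_transforms(opt))  # flip step omitted when --no_flip is set
```

If the repo follows this pattern, flipping is an augmentation meant for training, so disabling it at inference should not be needed for correctness, only for the slight quality difference Xzzit observed.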
