Is there a way to keep models in memory? #311
Unanswered
KaruroChori asked this question in Q&A
-
You can use this command line option to always keep them in memory: --highvram
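A minimal launch sketch using that flag. The `main.py` entry-point name is an assumption; adjust it to however your install starts the server:

```shell
# Start the server with --highvram so models stay resident in GPU
# memory instead of being offloaded after each task.
# ("main.py" is an assumed entry-point name, not confirmed by the thread.)
python main.py --highvram
```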
-
Hello, try this: #3545
-
How can you achieve this when using the API?
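This question goes unanswered in the thread. One reading: since --highvram is a server launch flag, it should apply to API-driven generation as well, with the server reusing an already-loaded model when consecutive prompts reference the same checkpoint. A minimal sketch of submitting a workflow over a ComfyUI-style HTTP API; the `/prompt` endpoint, port 8188, and payload shape are assumptions based on that style of API, not confirmed by the thread:

```python
import json
import urllib.request


def build_payload(workflow: dict) -> bytes:
    """Serialize a workflow into the JSON body a /prompt-style endpoint expects."""
    return json.dumps({"prompt": workflow}).encode("utf-8")


def queue_prompt(workflow: dict, host: str = "127.0.0.1:8188") -> dict:
    """Submit a workflow to the server. Model caching happens server-side,
    so repeatedly queueing workflows that use the same checkpoint lets the
    server keep that model loaded (especially when launched with --highvram)."""
    req = urllib.request.Request(
        f"http://{host}/prompt",
        data=build_payload(workflow),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The key point is that nothing in the request controls caching; keeping the model in memory is decided by how the server was launched.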
-
It seems that models are loaded into and offloaded from GPU memory on every task execution.
Would it be possible to keep them cached, so that generation is faster as long as the same models are being used?
At the moment this seems to cause a massive performance hit compared to auto1111, at least on my system.