SDXL? #38
Replies: 3 comments
- Hi, I plan to have a look soon. VRAM may be an issue, but with some loading/unloading I'm sure it'll work eventually. If I can get it going I'll add it to the repo (possibly with a temporary UI).
- There's the official ONNX export here: https://huggingface.co/stabilityai/stable-diffusion-xl-1.0-tensorrt (it says "tensorrt" in the title, but the files are ONNX).
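  For anyone who wants to pull those files locally, here is a minimal sketch using `huggingface_hub`; the `allow_patterns` filter is an assumption about which files matter, not a documented layout of that repository.

  ```python
  # Hypothetical sketch: download just the ONNX graphs (and configs) from the
  # repo linked above. The allow_patterns filter is an assumption; adjust it
  # to the repository's actual file layout.
  from huggingface_hub import snapshot_download

  local_dir = snapshot_download(
      repo_id="stabilityai/stable-diffusion-xl-1.0-tensorrt",
      allow_patterns=["*.onnx", "*.onnx_data", "*.json"],
  )
  print("Downloaded to:", local_dir)
  ```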
- I will look into it, but every model I've seen is fp32 rather than fp16, which greatly increases the model's memory consumption.
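  To put a rough number on that, assuming the SDXL base UNet is around 2.6 B parameters (an approximate figure):

  ```python
  # Back-of-the-envelope weight size for a ~2.6B-parameter UNet.
  params = 2.6e9
  print(f"fp32: {params * 4 / 2**30:.1f} GiB")  # ~9.7 GiB
  print(f"fp16: {params * 2 / 2**30:.1f} GiB")  # ~4.8 GiB
  ```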
- Hello, I would like to ask whether there will be an option to convert the SDXL model to ONNX with fp16. The script within Optimum (O4) only works with CUDA cards...
  Or maybe someone could share an fp16 version of the model already converted to ONNX?
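  One possible workaround without a CUDA card, sketched under assumptions (it is not the Optimum O4 path, and the paths below are illustrative): export the pipeline to fp32 ONNX on CPU with `optimum-cli`, then cast the weights to fp16 with `onnxconverter-common`. The SDXL VAE is known to be numerically fragile in fp16, so it may be safer to leave it in fp32.

  ```python
  # Sketch only: fp32 ONNX export on CPU, then an offline fp16 weight cast.
  # Requires: pip install optimum[exporters] onnx onnxconverter-common
  # Export first (CPU is fine for fp32):
  #   optimum-cli export onnx --model stabilityai/stable-diffusion-xl-base-1.0 sdxl_onnx_fp32/
  import os
  import onnx
  from onnxconverter_common import float16

  src = "sdxl_onnx_fp32/unet/model.onnx"   # illustrative path; the exporter decides the layout
  dst = "sdxl_onnx_fp16/unet/model.onnx"
  os.makedirs(os.path.dirname(dst), exist_ok=True)

  model = onnx.load(src)  # external weight data in the same folder is picked up automatically
  model_fp16 = float16.convert_float_to_float16(model, keep_io_types=True)
  onnx.save_model(
      model_fp16,
      dst,
      save_as_external_data=True,      # keeps the protobuf itself under the 2 GB limit
      all_tensors_to_one_file=True,
  )
  ```

  The main win is the smaller weight files; whether fp16 actually runs faster depends on the execution provider you load the model with.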