Viewport preview not working #1
Comments
Uuuh, looks like a problem with my code that copies the render result into Blender's frame buffer via OpenGL in C++. Maybe I will see...
I was thinking maybe it was an edge case caused by me having multiple GPUs - a laptop one and an external one. However, I'm only really using one of them: the external GPU, to which my monitors are connected. Would it be good to just ignore the internal one?
This has to be fixed anyway. I don't think it's worth playing with the setup. Never touch a running system :)
I added two additional methods to copy the render result into Blender's frame buffer: "python bgl texture" and "gpu_extras", alongside the existing "C++ OpenGL texture". They are selectable via "Method to blit to frame buffer" in the Render properties.
I hope that "python bgl texture" or "gpu_extras" works for you. Even if "python bgl texture" does not work, it at least gives us the possibility to test things out in Python without C++ compilation. Only the Blender add-on has to be updated. I attached it and will make an official release next week (I'm out of the office from Thursday). This is a bit of a quick fix; I really only tested it on Blender 3.2, so it may not work on 3.1 or 2.93. Thanks for reporting and testing :)
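For reference, a "gpu_extras"-style blit in a Blender render engine add-on typically uploads the rendered pixels into a GPU texture and draws it from the engine's view_draw callback. Here is a minimal sketch of that approach (the engine name and the fetch_render_result helper are only placeholders, not blRstr's actual code):

```python
import bpy
import gpu
from gpu_extras.presets import draw_texture_2d


class SketchRenderEngine(bpy.types.RenderEngine):
    # Placeholder identifiers, not the real blRstr engine.
    bl_idname = "SKETCH_RENDER_ENGINE"
    bl_label = "Sketch Render Engine"

    def view_draw(self, context, depsgraph):
        # Viewport size in pixels.
        width, height = context.region.width, context.region.height

        # Placeholder for however the engine fetches its rendered RGBA
        # floats (flat sequence of width * height * 4 values).
        pixels = self.fetch_render_result(width, height)

        # Upload the pixels into a GPU texture and draw it over the viewport.
        buf = gpu.types.Buffer('FLOAT', width * height * 4, pixels)
        tex = gpu.types.GPUTexture((width, height), format='RGBA16F', data=buf)
        draw_texture_2d(tex, (0, 0), width, height)

    def fetch_render_result(self, width, height):
        # Stand-in for the real renderer: a flat gray image.
        return [0.5, 0.5, 0.5, 1.0] * (width * height)
```

A real engine would cache the texture between redraws and run Blender's display-space shader (bind_display_space_shader) for correct color management; both are left out here for brevity.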
I can confirm that the "gpu_extras" method works! The other two do not (in the viewport). It seems quite slow indeed. I can continue providing logs later if you're going to keep trying to make the faster methods work - I'll be checking the engine out to see if it can be useful to me, however incomplete the feature set is yet. I do appreciate the reuse of existing shader nodes!
Well, at least the last fallback works :) Sadly it will hide most of the 3090's power... But my initial guess that something on the C++ side is broken seems to be wrong. I will check the OpenGL code more next
Two more fixes:
These changes are in "C++ OpenGL texture" and "python bgl texture". Only the add-on needs to be updated; I attached it. Please note that the log output from the add-on itself is not redirected into "Absolute path of log file". I need to document this.
Okay, so the default C++ mode works for me now! I tried switching to the other modes - I don't see any difference in terms of performance, so either the switching didn't work or the performance is similar - but I'll need to test with a proper scene, not just a box with a light. I'll do some logging once I learn how to pipe console output on Windows. I had a funny bug on the first try today: when scaling an object up, my eGPU got turned off and I had to reboot.
That's great! If the C++ mode works now, we do not really need the other ones; I don't think it is worth putting effort into them. BTW: console output on Windows goes like this (you may want to use a full path for the log file):
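For example, from a Command Prompt in Blender's installation folder (the log path below is only a placeholder):

blender.exe > C:\blender_log.txt 2>&1

The "2>&1" part also captures stderr, which is where most of Blender's warnings and errors end up.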
Maybe an overheating problem? DXR raytracing uses a lot of power and once blRstr has all the data it needs it can reach >95% GPU usage. Which screen resolution are you running?
The environment node is a known problem. It is a bit hidden in the readme under the point "World shaders are buggy...". Thank you for your help so far.
As for the GPU, I'm running UHD resolution. It would be good to have resolution scaling like in Cycles or Unreal Engine, but a simple scene should be fairly responsive anyway... there's also DLSS, but I don't remember testing it much in UE yet.
Yes, that should be enough :)
Tested using 3.1 and 3.2. F12 render works, but viewport rendering does not. I'm getting a repeated 'ERROR 1282 in glUseProgram' error.
2022-06-22_log.txt
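For reference, OpenGL error 1282 is GL_INVALID_OPERATION. When testing the "python bgl texture" path, one way to narrow down which call triggers it is to poll glGetError around the suspect calls, roughly like this (check_gl_error and program_id are only illustrative names):

```python
import bgl


def check_gl_error(label):
    # 1282 (0x0502) is GL_INVALID_OPERATION.
    err = bgl.glGetError()
    if err != bgl.GL_NO_ERROR:
        print("GL error %d after %s" % (err, label))


# Usage inside the add-on's draw code, e.g. around the shader bind:
#   check_gl_error("before glUseProgram")
#   bgl.glUseProgram(program_id)
#   check_gl_error("after glUseProgram")
```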