Efficient GStreamer sink drawing #37
I think in theory it should be possible, with some modifications. WebRender is used to generate frames and render them, but there's no requirement that everything go through a single path. It's not clear to me, though, how GstVideoOverlay can be synchronized with other drawing or with the window — could you signal it to render to the window after WebRender is done but before glutin swaps buffers, for instance? Or is it normally only used with separate X windows that don't have other widgets drawn? Could you use OpenGL instead of using X directly?
There is a glimagesink which at least sounds like it might use OpenGL directly. It still needs the window ID to render frames onto the drawable. I think I have to do more research to learn exactly what that means, though. I'm not sure how you would synchronize the rendering with the swaps. In the past, I've been able to render to GTK DrawingArea elements inside windows with other elements, so it is possible to render without needing a completely separate X window.
Maybe @sdroege can help out with coming up with a solution to this.
How does rendering in limn work? What rendering API(s) does it use, and what options are there to draw custom widgets?

OpenGL would definitely be an option. To integrate that with GStreamer, limn would need to expose its own GL context so that GStreamer can create a context that shares textures with the limn context. Access to the display would also be needed (for creating the context). This is basically how the GTK GL sink and the QML sink work.

For less tight integration, something around glimagesink could also be done: either integrating it directly with any limn GL context (some examples here are doing that, IIRC), or just providing a window handle to glimagesink (an EGL native window, an X11 XID, a Windows HWND, can't remember what on macOS/iOS, an ANativeWindow on Android). The disadvantage of these two solutions is that there's less possibility of communication between the toolkit and video rendering, and especially in the second case (window handle) the toolkit would have no control over drawing (e.g. with the QML/GTK sinks you can use normal QML/GTK APIs for transforming the video: rotations, scaling, etc.).

Note that the GStreamer GL API is currently not covered by the GStreamer Rust bindings. That would have to be added, but since there is no canonical GL crate (that I know of), this would probably become a bit awkward and low-level/unsafe in the public API.

Let me know if you have questions or need further pointers/help or anything.
The best way to integrate this would be to use WebRender's external image support. Gecko uses this to composite in video produced by our media stack; it means that GStreamer would provide a GL texture ID as an external image. https://hg.mozilla.org/mozilla-central/file/tip/gfx/webrender_bindings contains the code that does this in Gecko.
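As a rough illustration of the external-image path described above, a handler on the compositor side might look something like this. This is only a sketch: the `ExternalImageHandler` trait's exact signature has changed across WebRender releases, and the `current_texture` plumbing (how the video sink publishes its texture ID) is a hypothetical placeholder, not limn or Gecko code.

```rust
// Sketch of WebRender's external-image hook (assumes the `webrender` crate;
// trait shape follows an older webrender_api form and varies by version).
use webrender::api::{
    ExternalImage, ExternalImageHandler, ExternalImageId, ExternalImageSource, TexelRect,
};

/// Hands a GStreamer-produced GL texture to WebRender at composite time.
struct VideoImageHandler {
    // GL texture ID shared from GStreamer's GL context (hypothetical plumbing).
    current_texture: u32,
}

impl ExternalImageHandler for VideoImageHandler {
    // Called on the render thread just before WebRender samples the image.
    fn lock(&mut self, _key: ExternalImageId, _channel_index: u8) -> ExternalImage {
        ExternalImage {
            // Sample the full texture.
            uv: TexelRect::new(0.0, 0.0, 1.0, 1.0),
            source: ExternalImageSource::NativeTexture(self.current_texture),
        }
    }

    // Called after sampling; a real handler would release the frame here so
    // GStreamer can reuse or free the texture.
    fn unlock(&mut self, _key: ExternalImageId, _channel_index: u8) {}
}
```

The key property is that no pixel copy happens: WebRender samples the sink's texture directly during composition, which is exactly what the texture-copy workaround in Electron-style setups avoids paying for.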
Will it be possible to draw onto elements in a limn window using a GStreamer ximagesink? With a traditional toolkit like GTK, GStreamer can efficiently render onto a location in an application by setting the xwindow ID of the GstVideoOverlay.
However, with GPU-accelerated frameworks (e.g. Electron.js), this is difficult because everything is rendered through the GPU. The suggested workaround is to copy textures back and forth, which adds significant overhead for each video.
Since limn is built on WebRender, I was wondering whether it will have a similar issue to Electron, or whether direct GStreamer rendering into a limn window could be possible down the road.
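For reference, the traditional overlay approach described above looks roughly like this with the gstreamer-rs bindings. The element names are real, but this is a hedged sketch under assumptions: exact binding method names differ between gstreamer-rs releases, and the `xid` window handle (which must come from the toolkit) is a hypothetical parameter.

```rust
// Sketch: embedding video via GstVideoOverlay (assumes the `gstreamer` and
// `gstreamer-video` crates, aliased as `gst` and `gst_video`).
use gst::prelude::*;
use gst_video::prelude::*;

fn embed_video(xid: usize) -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // videotestsrc stands in for a real source; ximagesink implements
    // the GstVideoOverlay interface.
    let pipeline = gst::parse_launch("videotestsrc ! videoconvert ! ximagesink name=sink")?
        .downcast::<gst::Bin>()
        .unwrap();
    let sink = pipeline.by_name("sink").unwrap();
    let overlay = sink.dynamic_cast::<gst_video::VideoOverlay>().unwrap();

    // Tell the sink to draw into our existing window instead of creating its own.
    // SAFETY: `xid` must be a valid X window ID for the sink's display, and must
    // outlive the pipeline.
    unsafe { overlay.set_window_handle(xid) };

    pipeline.set_state(gst::State::Playing)?;
    Ok(())
}
```

This is the cheap path with a plain toolkit: the sink draws straight into the given X window with no texture copies, which is exactly what becomes awkward once the whole window is composited on the GPU.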