Raspberry Pi 4: Qt (PySide2) and pyrealsense2 frame buffer overflow #7968
Comments
Hi @zmukash The situation that you describe makes me think of a 16-frame limitation described in the link below. A solution that worked for one Python user was to save the frames in memory with the Keep() instruction. Keep() has its own limitation though, in that it is suited to recording short sessions of up to 30 seconds in length, because the computer's memory capacity is consumed by the stored frames.
Hi @MartyG-RealSense, and thanks for the fast answer and for the further input from other issues! Sadly, I have to tell you that the solutions you provided were only partially helpful. It now actually runs for more than those 16 frames, but not for more than those 30 seconds, because the memory is used up. I actually need to have the stream running longer than that... if possible, infinitely! Isn't there some kind of release command to free the memory that is blocked by the references to the frames? In C++ this wouldn't be much of an issue, but in Python the memory management options are rather limited. When I use the keep() command, it also doesn't matter if I close and reopen the pipeline. I also changed my object variables to temporary ones, tried to call Python's gc (garbage collection) and used the del command, but nothing prevents the stream from stopping because the memory is not actually freed.
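As general background on why del and gc.collect() may not help (a pure-Python sketch, unrelated to pyrealsense2 itself): CPython only frees a buffer once the *last* reference to it is gone. If something else, for example a GUI object wrapping the frame's memory or the SDK's internal Keep() store, still holds a reference, del only removes your own name for the object:

```python
import gc
import weakref


class Frame:
    """Stand-in for a frame object owning a large buffer."""
    def __init__(self):
        self.data = bytearray(8 * 1024 * 1024)  # ~8 MB


frame = Frame()
alias = frame                 # a second reference, e.g. held by a GUI object
probe = weakref.ref(frame)    # lets us observe when the object is freed

del frame                     # removes one name; the memory is NOT freed yet
gc.collect()
print(probe() is not None)    # True: 'alias' still keeps the buffer alive

del alias                     # last reference gone -> buffer is released
gc.collect()
print(probe() is None)        # True
```

So the question is always *who else* still holds a reference to the frame, not whether del was called.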
Under normal circumstances, the 400 Series cameras can run indefinitely, so long as their temperature remains within recommended tolerances and there isn't a glitch outside of the camera (in the USB equipment, computer hardware, operating system, etc). Likewise, recordings can run for long periods. Sometimes though, if a Python application is written in a certain way, it can cause the frames to reach a limit point where no further frames are received, as in the case linked to. There was a case of a Python user with a Keep() project that had non-releasing frames, and they found that the cause was their numpy version.
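The non-releasing-frames point can be made concrete with plain numpy (a sketch of the buffer-sharing behaviour only; the pyrealsense2 specifics are as described above): np.asanyarray() does not copy, so the resulting array keeps whatever owns the underlying buffer alive, while .copy() detaches from it:

```python
import numpy as np

# Stand-in for a frame buffer (e.g. what frame.get_data() exposes)
src = np.zeros((480, 640, 3), dtype=np.uint8)

view = np.asanyarray(src)   # no copy: shares (and pins) src's memory
print(view is src or view.base is src)  # True

detached = view.copy()      # owns its own memory; src could now be released
print(detached.base is None)            # True
```

This is why holding on to arrays made with np.asanyarray(frame.get_data()) can keep the SDK's frame slots occupied, while an explicit copy lets them be recycled.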
So I investigated the issue with the memory leak, and apparently it is the QImage (converter) function which was leaking memory in this context. I will provide further information on this issue later today (with a small example).
Thanks very much @zmukash for the forthcoming details
Through debugging I saw that it was actually not the keep() command that was leaking memory, but the QImage(...) function (seen in the realsenseColorFrameToQImage() function in the first post). There is a workaround for this here. By simply replacing this I got the RealSense to work without the keep() function, but somehow my UI still stopped after a few seconds. I did not investigate this further because I had already written an alternative test program which gave me much better results (see below). I used multiprocessing instead of threading. From my understanding, this has the advantage that memory is not directly shared between processes; communication is done through a Queue instead. Attention: multiprocessing can throw exceptions when certain formats are not used for inter-process communication, but this is quite easily solved by just queuing numpy ndarrays. I also struggled for a while before figuring out that multiprocessing does not like being handed specific class instances from another process. So my advice would be: make the data-collection process as standalone as possible, and don't try to make it use, for example, a pipeline() from another process.

main.py

```python
from PySide2.QtWidgets import QApplication
from mainWindow import MainWindow
import sys
import multiprocessing as mp
import pyrealsense2 as rs
import numpy as np

# Capture method which runs in a separate process
def foo(q, lock):
    config = rs.config()
    pipeline = rs.pipeline()
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)
    while True:
        with lock:
            frames = pipeline.wait_for_frames()
            # Convert the frame to a plain ndarray; ndarrays can be
            # queued between processes without pickling problems
            color_frame = np.asanyarray(frames.get_color_frame().get_data())
            q.put(color_frame)

if __name__ == "__main__":
    # Create the queue, lock and process from the same (spawn) context
    ctx = mp.get_context('spawn')
    lock = ctx.Lock()
    q = ctx.Queue(maxsize=4)
    p = ctx.Process(target=foo, args=(q, lock))
    p.start()
    # Start the Qt application window
    app = QApplication(sys.argv)
    window = MainWindow(q)
    window.show()
    app.exec_()
    p.join()
```

mainWindow.py

```python
from PySide2.QtWidgets import QLabel, QVBoxLayout, QWidget, QMainWindow
from PySide2.QtGui import QPixmap
from PySide2.QtCore import Slot
from camera import Camera
import numpy as np
import qimage2ndarray

class MainWindow(QMainWindow):
    def __init__(self, q):
        QMainWindow.__init__(self)
        self.widget = QWidget()
        self.color_label = QLabel("")
        self.widgetLayout = QVBoxLayout()
        self.widgetLayout.addWidget(self.color_label)
        self.widget.setLayout(self.widgetLayout)
        self.setCentralWidget(self.widget)
        self.camera = Camera(q)
        self.camera.colorFramesReady.connect(self.StartCameraUpdates)
        self.camera.start()

    def realsenseColorFrameToQImage(self, color_frame):
        # qimage2ndarray avoids the QImage memory leak described above
        result = qimage2ndarray.array2qimage(color_frame)
        return result

    @Slot(np.ndarray)
    def StartCameraUpdates(self, image):
        self.color_label.setPixmap(QPixmap.fromImage(self.realsenseColorFrameToQImage(image)))
```

camera.py

```python
from PySide2.QtCore import QThread, Signal
import numpy as np

class Camera(QThread):
    colorFramesReady = Signal(np.ndarray)

    def __init__(self, q):
        QThread.__init__(self)
        self.q = q
        print('QThread Opening...')

    def run(self):
        while True:
            # Blocks until an ndarray arrives from the capture process
            self.colorFramesReady.emit(self.q.get())
```
This is a great resource for RealSense community members in the future who might encounter similar problems. Thanks for sharing your solution! :)
Issue Description
Hello Everybody!
I am currently trying to port an x86 Python/Qt GUI application to the Raspberry Pi 4.
My problem is that whenever I run the quite simple test application, I hit an issue which is quite awkward and which I couldn't get my head around. When the application starts, the Qt UI window opens and the video stream starts playing (it comes from a QThread, but I tried it with Python threading too), but only for exactly 16 frames. It doesn't matter if I reduce the resolution, the framerate, or other settings, or even if I "manually" add a time.sleep(x) call and then just poll_for_frames() instead of wait_for_frames(). After those 16 frames, "RuntimeError: Frame didn't arrive within 5000" is thrown.
So I did some research on this, and I think the problem could be that the frame buffer is overflowing and therefore no new frames arrive. Usually this is handled by the pyrealsense2 functions, but I think there has to be some kind of "misplay" between the different libraries I use. I tried some coding with the frame_queue class, and after those initial 16 frames I tried to reinitialize the frame_queue object, but I didn't get this to work properly. Still the same problem. Oddly, the examples from the RealSense documentation work flawlessly, and even when I rewrite them as Python threads they work (with OpenCV used for presentation). As soon as I try anything with Qt, the issue occurs.
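The suspected failure mode can be modelled with a plain bounded queue (a hypothetical illustration only; the 16-slot capacity is chosen to match the 16 frames observed, not taken from the SDK): once the consumer stops draining, the buffer fills up and every further frame is lost:

```python
import queue

q = queue.Queue(maxsize=16)     # fixed-capacity buffer, like a driver frame queue
dropped = 0
for frame_id in range(40):      # the camera keeps producing frames...
    try:
        q.put_nowait(frame_id)
    except queue.Full:          # ...but a stalled consumer never drains the queue
        dropped += 1
print(q.qsize(), dropped)  # 16 24
```

If something on the Qt side blocks the thread that should be draining frames, this would produce exactly the "runs for N frames, then times out" symptom.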
Does anyone have a similar problem or at least any useful information about this issue?
Below is the "cleanest" version of the code, which at least on my Raspberry Pi reliably reproduces the issue:
main.py
mainWindow.py
camera.py