
[OTHER] Slow times of stopping the camera recording #1136

Open
ilirosmanaj opened this issue Oct 21, 2024 · 7 comments
@ilirosmanaj

I am working on a module that uses two cameras connected to the board. Both cameras run in video stream mode (though for the sake of this mini example I've let one run in capture mode and one in video mode).

This is quite a time-sensitive application, so I need to make sure the whole flow runs as quickly as possible. However, in my main app I am noticing that stopping the camera recording takes 200-400ms, which seems too long.

Sample code:

import time
import io
from threading import Condition
from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder, H264Encoder
from picamera2.outputs import FileOutput


def print_time(start_time: float, end_time: float, message: str) -> None:
    print(f"{message} took: {(end_time - start_time):.3f}s")

class StreamingOutput(io.BufferedIOBase):
    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


STREAMING_OUTPUT = StreamingOutput()

MJPEG_ENCODER = MJPEGEncoder(bitrate=14 * 1000000)
FILE_OUTPUT = FileOutput(STREAMING_OUTPUT)

camera1 = Picamera2(0)
camera2 = Picamera2(1)

start_time = time.perf_counter()
camera1.start_recording(MJPEG_ENCODER, FILE_OUTPUT)
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 1 start duration")

start_time = time.perf_counter()
camera2.start()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 2 start duration")

start_time = time.perf_counter()
camera1.stop_recording()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 1 stop duration")

start_time = time.perf_counter()
camera2.stop()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 2 stop duration")

Sample output:

$ python picamera_stop.py 
[3:08:31.847773757] [143896]  INFO Camera camera_manager.cpp:297 libcamera v0.0.5+83-bde9b04f
[3:08:32.118285752] [143922]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@0/imx708@1a to Unicam device /dev/media4 and ISP device /dev/media1
[3:08:32.118519752] [143922]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[3:08:32.163806008] [143922]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@1/imx708@1a to Unicam device /dev/media5 and ISP device /dev/media2
[3:08:32.164103342] [143922]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[3:08:32.212697623] [143896]  INFO Camera camera.cpp:1033 configuring streams: (0) 1280x720-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[3:08:32.215352887] [143922]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@0/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 1 start duration took: 0.410s
[3:08:32.624291895] [143896]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[3:08:32.628214403] [143922]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@1/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 2 start duration took: 0.305s
Camera 1 stop duration took: 0.206s
Camera 2 stop duration took: 0.019s

Notice how much quicker stop() is compared to stop_recording().

Any ideas?

@davidplowman
Collaborator

First thing to note is that stop_recording() just calls stop() followed by stop_encoder(), so it might be worth breaking those out to measure those two times separately (also explains why the extra stop() is quick - it is already stopped!).

Also, maybe put a sleep(1) before stopping everything because the behaviour of stopping-right-as-things-are-starting might be slightly different. Then let's see what numbers you get.

@ilirosmanaj
Author

Splitting stop_recording() into stop() and stop_encoder(), like the following:

import time
import io
from threading import Condition
from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder, H264Encoder
from picamera2.outputs import FileOutput


def print_time(start_time: float, end_time: float, message: str) -> None:
    print(f"{message} took: {(end_time - start_time):.3f}s")

class StreamingOutput(io.BufferedIOBase):
    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


STREAMING_OUTPUT = StreamingOutput()

MJPEG_ENCODER = MJPEGEncoder(bitrate=14 * 1000000)
FILE_OUTPUT = FileOutput(STREAMING_OUTPUT)

camera1 = Picamera2(0)
camera2 = Picamera2(1)

start_time = time.perf_counter()
camera1.start_recording(MJPEG_ENCODER, FILE_OUTPUT)
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 1 start duration")

start_time = time.perf_counter()
camera2.start()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 2 start duration")

start_time = time.perf_counter()
camera1.stop()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 1 stop duration")

start_time = time.perf_counter()
camera1.stop_encoder()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 1 stop encoder duration")

start_time = time.perf_counter()
camera2.stop()
print_time(start_time=start_time, end_time=time.perf_counter(), message="Camera 2 stop duration")

The results:

[0:36:22.649681623] [33134]  INFO Camera camera_manager.cpp:297 libcamera v0.0.5+83-bde9b04f
[0:36:22.924540197] [33160]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@0/imx708@1a to Unicam device /dev/media4 and ISP device /dev/media2
[0:36:22.926272604] [33160]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[0:36:22.975133308] [33160]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@1/imx708@1a to Unicam device /dev/media5 and ISP device /dev/media3
[0:36:22.975369826] [33160]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[0:36:23.022305975] [33134]  INFO Camera camera.cpp:1033 configuring streams: (0) 1280x720-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[0:36:23.024149104] [33160]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@0/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 1 start duration took: 0.406s
[0:36:23.429614808] [33134]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[0:36:23.432676271] [33160]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@1/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 2 start duration took: 0.273s
Camera 1 stop duration took: 0.022s
Camera 1 stop encoder duration took: 0.181s
Camera 2 stop duration took: 0.022s

If I add a sleep(1) between stop and stop_encoder, the output is almost identical as well.

@davidplowman
Collaborator

Thanks for the update. So it sounds like stopping a camera is taking about 0.2 seconds, the bulk of which is stopping the encoder. It looks from the debug like this might be a Compute Module 4, is that right?

Stopping the encoder is probably a bit slow because it does flush out all the submitted frames before stopping. Maybe that's something that could be changed? If you were up for some experimentation, you could go to this line and replace it by

        while self._running:

(Strictly, we should add some code after the while loop to empty the queue and release() anything in it, but if you're stopping the camera as well then that shouldn't matter.) Let's see what that does.
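The drain step mentioned above might look something like this (a sketch only, assuming `buf_frame` is a `queue.Queue` of camera requests the encoder has not yet released, and `release()` hands each one back to the camera; the helper name is made up):

```python
import queue


def drain_pending_frames(buf_frame):
    """Release any camera frames still queued for the encoder,
    without waiting for them to be encoded."""
    while True:
        try:
            item = buf_frame.get_nowait()
        except queue.Empty:
            break
        item.release()  # hand the request back to the camera
```

Since the camera is being stopped anyway, skipping the encode of these leftover frames should be harmless.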

@ilirosmanaj
Author

Yes, it's a CM4 module.

I am running picamera2 version 0.3.12. This version has the following thread_poll behavior:

    def thread_poll(self, buf_available):
        """Outputs encoded frames"""
        pollit = select.poll()
        pollit.register(self.vd, select.POLLIN)

        while self._running:
            for _, event in pollit.poll(200):
                if event & select.POLLIN:
                    buf = v4l2_buffer()
                    planes = v4l2_plane * VIDEO_MAX_PLANES
                    planes = planes()
                    buf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE
                    buf.memory = V4L2_MEMORY_DMABUF
                    buf.length = 1
                    buf.m.planes = planes
                    ret = fcntl.ioctl(self.vd, VIDIOC_DQBUF, buf)

                    if ret == 0:
                        buf_available.put(buf.index)

                    buf = v4l2_buffer()
                    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE
                    buf.memory = V4L2_MEMORY_MMAP
                    buf.length = 1
                    ctypes.memset(planes, 0, ctypes.sizeof(v4l2_plane) * VIDEO_MAX_PLANES)
                    buf.m.planes = planes
                    ret = fcntl.ioctl(self.vd, VIDIOC_DQBUF, buf)
                    keyframe = (buf.flags & V4L2_BUF_FLAG_KEYFRAME) != 0

                    if ret == 0:
                        bufindex = buf.index
                        buflen = buf.m.planes[0].length

                        # Write output to file
                        b = self.bufs[buf.index][0].read(buf.m.planes[0].bytesused)
                        self.bufs[buf.index][0].seek(0)
                        self.outputframe(b, keyframe, (buf.timestamp.secs * 1000000) + buf.timestamp.usecs)

                        # Requeue encoded buffer
                        buf = v4l2_buffer()
                        planes = v4l2_plane * VIDEO_MAX_PLANES
                        planes = planes()
                        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE
                        buf.memory = V4L2_MEMORY_MMAP
                        buf.index = bufindex
                        buf.length = 1
                        buf.m.planes = planes
                        buf.m.planes[0].bytesused = 0
                        buf.m.planes[0].length = buflen
                        ret = fcntl.ioctl(self.vd, VIDIOC_QBUF, buf)

                        # Release frame from camera
                        queue_item = self.buf_frame.get()
                        queue_item.release()

This version already relies only on the self._running flag.

Is there any trick to modify the encoder state from the outside? I tried setting MJPEG_ENCODER.buff_frame = queue.Queue() to clean it up, but I still see the same issue.

@davidplowman
Collaborator

Oh, one thing I didn't notice, what happens if you do stop_encoder() before stop()?

@ilirosmanaj
Author

Still the same:

[18:00:07.632261437] [821756]  INFO Camera camera_manager.cpp:297 libcamera v0.0.5+83-bde9b04f
[18:00:07.904199279] [821779]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@0/imx708@1a to Unicam device /dev/media4 and ISP device /dev/media2
[18:00:07.904448900] [821779]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[18:00:07.951309393] [821779]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@1/imx708@1a to Unicam device /dev/media5 and ISP device /dev/media3
[18:00:07.951554347] [821779]  INFO RPI pipeline_base.cpp:1101 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[18:00:07.996205553] [821756]  INFO Camera camera.cpp:1033 configuring streams: (0) 1280x720-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[18:00:07.998825479] [821779]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@0/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 1 start duration took: 0.397s
[18:00:08.395545369] [821756]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888 (1) 1536x864-SBGGR10_CSI2P
[18:00:08.398453105] [821779]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@1/imx708@1a - Selected sensor format: 1536x864-SBGGR10_1X10 - Selected unicam format: 1536x864-pBAA
Camera 2 start duration took: 0.263s
Camera 1 stop encoder duration took: 0.206s
Camera 1 stop duration took: 0.017s
Camera 2 stop duration took: 0.020s

@davidplowman
Collaborator

Can't explain that. You might have to put some instrumentation into the code to see when things are happening. stop_encoder() can certainly take a frame time to wake up and stop (so long as the camera is still running), but I don't see how things can take longer than that. It also flushes and closes the output, but I'd expect that to be quick as well.
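One low-effort way to add that instrumentation (a hedged sketch; wrapping methods from outside like this is not an official picamera2 API, just ordinary Python monkey-patching) is a timing decorator applied to the suspect calls before stopping:

```python
import functools
import time


def timed(fn):
    """Wrap a callable so each call prints how long it took."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            print(f"{fn.__name__} took {(time.perf_counter() - t0) * 1000:.1f} ms")
    return wrapper


# Hypothetical usage against the objects from the sample code above,
# applied before the stop sequence:
# MJPEG_ENCODER.stop = timed(MJPEG_ENCODER.stop)
# FILE_OUTPUT.stop = timed(FILE_OUTPUT.stop)
```

That would at least show whether the ~200ms is spent joining the poll thread or flushing/closing the output.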
