
depth frame size is different when we start and stop the pipe after receiving one frame #10432

Closed
sarath-kumar1 opened this issue Apr 22, 2022 · 11 comments

@sarath-kumar1

sarath-kumar1 commented Apr 22, 2022

Hi,
I am using a RealSense D435i camera to capture depth frames.
When I run my application with the stream enabled (width = 1280, height = 720), I always get a depth image of 2,764,800 bytes, which is valid, but the process size in RAM grows to 2 GB (95%).
To avoid this I started calling pipe.stop() and pipe.start() after receiving each frame. Now the process size in RAM no longer grows much, but some depth frames now arrive with a smaller size.

Please find below log for more info:

void __cdecl DeviceRealSense::dataReceived(void) Received Real Sense Data
depth.get_data_size() 2764800
void __cdecl DeviceRealSense::dataReceived(void) Received Real Sense Data
depth.get_data_size() 1221120
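For reference, 2,764,800 bytes corresponds to 1280 × 720 × 3 bytes per pixel, i.e. the RGB8 output of the colorizer applied in the code below, while a raw Z16 depth frame at this resolution would be 1280 × 720 × 2 = 1,843,200 bytes.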

I need help with how to keep the application from using up to 95% of RAM while collecting depth frames without loss.

@MartyG-RealSense

Hi @sarath-kumar1 I would definitely not recommend performing a start-stop every frame, except when creating a camera stress-test application to see how stability holds up after repeated starts and stops.

It sounds as though your application is experiencing a memory leak where the computer's free memory capacity is progressively consumed over time whilst the program is running instead of the memory being released.

Would it be possible to post your original script (the one without start-stop) in a comment below, please?

@sarath-kumar1

sarath-kumar1 commented Apr 22, 2022

Dear @MartyG-RealSense, please find below my sample Qt code for capturing depth frames.

// Derived class method which triggers every 33 ms to read frame data
void DeviceRealSense::dataReceived()
{
    rs2::context ctx;
    rs2::device_list devices = ctx.query_devices();
    if(devices.size() > 0)
    {
        qDebug() << Q_FUNC_INFO << "Received Real Sense Data";
        rs2::frameset frames = m_pipe.wait_for_frames();
        rs2::frame IRLeft = frames.get_infrared_frame(0);
        rs2::frame IRRight = frames.get_infrared_frame(1);
        rs2::frame depth = frames.get_depth_frame().apply_filter(color_map);
        rs2::frame rgbFrame = frames.get_color_frame();
        qDebug() << "depth.get_data_size()" << depth.get_data_size();

        DeviceData *device = new DeviceData;
        device->setBuffer((char*)depth.get_data(), depth.get_data_size());
        // m_pipe.stop();
        // m_pipe.start();
        emit deviceDetails(device);
    }
    else
    {
        emit connected(INTEL_REALSENSE, 0, false);
    }
}

// Base class method that checks device connectivity and starts a pipe
void Device::connectDevice(uint timeout)
{
    connect(&m_timer, SIGNAL(timeout()), this, SLOT(dataReceived()));
    m_timer.start(33);
    rs2::config cfg;

    cfg.enable_stream(RS2_STREAM_COLOR, 1280, 720, RS2_FORMAT_RGBA8, 30);
    cfg.enable_stream(RS2_STREAM_DEPTH, 1280, 720, RS2_FORMAT_Z16, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 1280, 720, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 2, 1280, 720, RS2_FORMAT_Y8, 30);
    mConnectStatus = false;
    if(get_realsense_device_status())
    {
        mConnectStatus = true;
        m_pipe.start(cfg);
        qDebug() << "Depth connected";
        emit connected(m_DeviceType, m_deviceID, true);
    }
}

@MartyG-RealSense

Does any part of your project make use of a Qt function called QImage?

A RealSense user who was using Qt and had an apparent memory leak stated in #7968 (comment) that their debugging found that QImage was causing it in their particular project.
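Separately, one thing worth checking in the dataReceived() code you posted: a new DeviceData is heap-allocated for every frame (about 2.6 MB of buffer each at 30 FPS) and emitted as a raw pointer. If the receiver of the deviceDetails signal never deletes it, that alone would consume memory at roughly 80 MB per second. A possible remedy, sketched under the assumption that the deviceDetails signal signature can be changed, is to hand ownership over via a reference-counted pointer:

    // Sketch only, not code from this thread: assumes the deviceDetails
    // signal is changed to take QSharedPointer<DeviceData>.
    // (Queued cross-thread delivery would also need
    //  qRegisterMetaType<QSharedPointer<DeviceData>>().)
    auto device = QSharedPointer<DeviceData>::create();
    device->setBuffer((char*)depth.get_data(), depth.get_data_size());
    emit deviceDetails(device);   // freed automatically when the last holder releases it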

@sarath-kumar1

I am not using QImage in this scenario; I am just copying the frame data to a buffer and streaming it over TCP/IP.
RAM usage is still increasing very fast.

@MartyG-RealSense

MartyG-RealSense commented Apr 24, 2022

Looking at your code in the void DeviceRealSense::dataReceived() function, the rs2:: instructions appear to be called every 33 ms.

The two instructions below are ones that should be placed before the pipe.start instruction and therefore run only once when the program launches, not looped.

rs2::context ctx;
rs2::device_list devices=ctx.query_devices();

The SDK's rs-multicam C++ example, which uses ctx / query_devices, demonstrates this principle of placing these instructions before the pipe start.

https://github.com/IntelRealSense/librealsense/blob/master/examples/multicam/rs-multicam.cpp#L15-L31
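Applied to your code, the restructure might look roughly like this (a sketch only; m_ctx is an assumed new member of Device):

    // Sketch: hold one rs2::context for the lifetime of the object instead
    // of constructing it inside the 33 ms timer slot.
    class Device : public QObject
    {
        // ...
        rs2::context  m_ctx;    // created once with the object
        rs2::pipeline m_pipe;
    };

    void Device::connectDevice(uint timeout)
    {
        if (m_ctx.query_devices().size() > 0)   // query once, before pipe.start
        {
            rs2::config cfg;
            // ... enable_stream calls as in your code ...
            m_pipe.start(cfg);
        }
    }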

I wonder whether you are aiming to query the camera(s) every 33 ms to check whether they are still connected. If so, this would not be a recommended approach. Instead, the 'correct' way would be to listen for camera connect and disconnect events with a C++ instruction such as set_devices_changed_callback, as described at #931
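A minimal sketch of that callback approach (assuming a single long-lived rs2::context, as above):

    #include <librealsense2/rs.hpp>
    #include <iostream>

    int main()
    {
        rs2::context ctx;
        // The callback fires on hotplug events instead of the application
        // polling query_devices() on a timer.
        ctx.set_devices_changed_callback([](rs2::event_information& info)
        {
            if (info.get_new_devices().size() > 0)
                std::cout << "RealSense camera connected" << std::endl;
            // info.was_removed(dev) can be tested against a held rs2::device
            // handle to detect that a specific camera was unplugged.
        });
        // ... run the rest of the application ...
    }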

@sarath-kumar1

Dear @MartyG-RealSense,
I have reimplemented the dataReceived() method as you suggested, but memory usage still grows and eventually crashes my application.
New implementation:

void DeviceRealSense::dataReceived()
{
    bool disconnected = false;
    try {
        if(!disconnected)
        {
            rs2::frameset frames = m_pipe.wait_for_frames(500);
            rs2::frame depth = frames.get_depth_frame().apply_filter(color_map);
            DeviceData *device = new DeviceData;
            device->setTime((depth.get_timestamp()*1e6)+((Settings::getInstance()->getTimeZoneOffset())*1e11));
            device->setBuffer((char*)depth.get_data(), depth.get_data_size());
            emit deviceDetails(device);
        }
        else
        {
            emit connected(INTEL_REALSENSE, 0, false);
        }
    }
    catch(const rs2::error &e)
    {
        qDebug() << "RealSense disconnected";
        disconnected = true;
    }
}

@MartyG-RealSense

Does it still occur if you remove '500' from the wait_for_frames() brackets?

In #6569 (comment) a RealSense team member suggests an alternative approach to using rs2::frame depth = frames.get_depth_frame().apply_filter(color_map);
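That comment is not quoted here, but as an illustration only, one variant is to run the colorizer explicitly and keep the raw depth frame separate:

    // Illustrative sketch, not necessarily the exact code from the linked comment.
    rs2::depth_frame depth_raw = frames.get_depth_frame();   // raw Z16 values
    rs2::frame depth_color = color_map.colorize(depth_raw);  // RGB8, for display only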

@sarath-kumar1

Dear @MartyG-RealSense,
I tried removing '500' from wait_for_frames(), but still no luck!

@MartyG-RealSense

Have you tried running a simple C++ example like the RealSense SDK's Ready to Hack test script at the link below to check whether the memory leak still occurs when running another script? If it does, it could indicate that the memory leak problem is not related to your own code; if it does not, the problem is likely somewhere in your application, such as in Qt.

https://github.com/IntelRealSense/librealsense#ready-to-hack
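A minimal standalone capture loop in that spirit (a sketch, not the verbatim script from the link) would be roughly:

    #include <librealsense2/rs.hpp>
    #include <iostream>

    int main()
    {
        // Start streaming with the default configuration and read frames forever;
        // if this loop holds steady in RAM, librealsense itself is not leaking.
        rs2::pipeline p;
        p.start();
        while (true)
        {
            rs2::frameset frames = p.wait_for_frames();
            rs2::depth_frame depth = frames.get_depth_frame();
            std::cout << "depth frame: " << depth.get_width() << "x"
                      << depth.get_height() << std::endl;
        }
    }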

@sarath-kumar1

Hi @MartyG-RealSense, we tried a simple standalone app that just reads frames and found no memory leaks while reading frames.
Thanks for your help. You can close this ticket. :)

@MartyG-RealSense

Thanks very much for the update, @sarath-kumar1 :) As you suggested, I will close the case. Thanks again!
