Consumes a huge amount of memory during processing #1527
Comments
What's the dtype of the binary file? Could you provide a bit more info about what the file contains? One thing you could try would be to do the same at the rawio level. Have you used that before? I'm wondering if reshape is causing the huge memory spike. If you test at the rawio level and it still has the memory spike, then I think I know how to fix it: we would have to slow the RawIO level down to protect the memory.
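As a neo-free illustration of what reading at the raw level can avoid, a binary file can be memory-mapped and processed in chunks so that only a small window of samples is resident at a time. This is a sketch using `numpy.memmap`; the filename, channel count, and `int16` dtype are assumptions for the example, not details taken from the issue:

```python
import numpy as np

# Assumed layout: interleaved int16 samples, 40 channels (hypothetical values).
nb_channel = 40
dtype = np.int16

# Write a small synthetic file so the sketch is self-contained.
data = np.arange(30000 * nb_channel, dtype=dtype).reshape(-1, nb_channel)
data.tofile('example_synthetic.dat')

# Memory-map instead of loading: indexing reads only the pages actually touched.
mm = np.memmap('example_synthetic.dat', dtype=dtype, mode='r')
mm = mm.reshape(-1, nb_channel)  # reshape on a contiguous memmap is a view, not a copy

chunk_size = 5000
for start in range(0, mm.shape[0], chunk_size):
    chunk = np.asarray(mm[start:start + chunk_size])  # materialize one chunk only
    # ... process chunk here ...
```

The key point is that the full array is never loaded; each iteration touches at most `chunk_size * nb_channel * 2` bytes of file data.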
@zm711 Additionally, trying to plot in stages using chunks, or using plt.subplots() instead of plt.plot(), did not solve the issue. It might be that matplotlib is using a lot of memory during processing.
How do you know that it is using that much memory? How are you measuring RSS?
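For reference, one way to measure peak RSS from inside the script itself is the standard-library `resource` module (Unix only; `psutil` is a common cross-platform alternative). Neither tool is named in the thread, so this is just a sketch of one measurement approach:

```python
import sys
import resource

# ru_maxrss is the peak resident set size of this process:
# kilobytes on Linux, bytes on macOS.
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
if sys.platform == 'darwin':
    peak_mb = peak / (1024 ** 2)  # macOS reports bytes
else:
    peak_mb = peak / 1024         # Linux reports kilobytes
print(f"peak RSS: {peak_mb:.1f} MB")
```

Printing this after each stage (read, rescale, plot) narrows down which step is responsible for the spike.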
@h-mayorquin
My guess is that matplotlib is consuming a lot of memory. What is the memory consumption when doing only this?

```python
import neo
import numpy as np

data_file = 'example.dat'
nb_channel = 40
analog_digital_input_channels = 8
sampling_rate = 30000

reader = neo.io.RawBinarySignalIO(filename=data_file, nb_channel=nb_channel, sampling_rate=sampling_rate)
block = reader.read_block()
analog_signals = block.segments[0].analogsignals[0]
numpy_signal = analog_signals.magnitude[:, :nb_channel - analog_digital_input_channels]
```
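A back-of-the-envelope check shows why 1 GB on disk can balloon even before plotting. Assuming `int16` samples on disk (the dtype is not stated in the issue), converting to `float64` physical units alone multiplies the footprint by 4, and every full-size intermediate (the scaled signal, `magnitude`, the column slice, matplotlib's own buffers) adds another copy. The copy count below is illustrative, not a measurement:

```python
file_gb = 1.0        # size on disk
disk_itemsize = 2    # int16 (assumed)
float_itemsize = 8   # float64 after scaling to physical units

one_copy_gb = file_gb * float_itemsize / disk_itemsize  # GB per float64 copy
# Raw load plus three full-size float64 intermediates already gives:
total_gb = file_gb + 3 * one_copy_gb
print(total_gb)  # 13.0
```

Even a handful of such copies exceeds the 10 GB Lambda budget, which is why chunked or memory-mapped processing matters here.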
I want to read data from a .dat file using RawBinarySignalIO and plot it. When I ran the program below, it consumed more than 32 GB of memory for a 1 GB .dat file. I want to run this process on AWS Lambda, so it needs to execute with less than 10 GB of memory. Is there a way to achieve this?
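One direction worth testing on the plotting side (a common workaround, not something proposed in the thread): a screen can only show a few thousand points per trace, so decimating before handing data to matplotlib bounds both memory and draw time. A sketch of min/max decimation with plain numpy; the bin size is an arbitrary example value:

```python
import numpy as np

signal = np.sin(np.linspace(0, 100, 300_000)).astype(np.float32)  # synthetic trace

bin_size = 1000  # samples per plotted min/max pair (arbitrary)
n_bins = len(signal) // bin_size
binned = signal[:n_bins * bin_size].reshape(n_bins, bin_size)

# Keep the min and max of each bin so peaks survive the decimation.
envelope = np.empty(2 * n_bins, dtype=signal.dtype)
envelope[0::2] = binned.min(axis=1)
envelope[1::2] = binned.max(axis=1)
# plt.plot(envelope) would now draw 600 points instead of 300,000.
```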