EDFRecordingExtractor can not be loaded from dict while still open #1228
I discovered this while working on #1227. The following code throws an assertion:

```python
from pathlib import Path

from pyedflib import EdfReader
from spikeinterface.extractors import EDFRecordingExtractor

file_path = Path("/home/heberto/neuroconv_testing_data/ephy_testing_data/edf/edf+C.edf")
reader = EdfReader(str(file_path))
reader2 = EDFRecordingExtractor(file_path=file_path)
```

So it seems that this extractor cannot maintain two open references to the same file. This seems to be a problem coming to neo through pyedflib. I wonder what we should do in these cases? Should the reader in neo or spikeinterface raise an informative error if the file is already open, telling the user that they have to close the old one first?

Note that this works well in separate processes, though. The following does not throw an assertion:

```python
from pathlib import Path
from concurrent.futures import ProcessPoolExecutor

from pyedflib import EdfReader

file_path = Path("/home/heberto/neuroconv_testing_data/ephy_testing_data/edf/edf+C.edf")
n_jobs = 8
file_path_list = [(i, file_path) for i in range(n_jobs)]

def initializer(file_path):
    number, file_path = file_path
    print(number, file_path)
    reader = EdfReader(str(file_path))
    print(reader)

with ProcessPoolExecutor(max_workers=n_jobs // 2) as executor:
    results = executor.map(initializer, file_path_list)
```

Thanks for this.

@h-mayorquin can we close this and, if appropriate, move it to neo?

Yes, I would be fine with that. We should first open the issue in neo and then close this one, though, so we don't forget.

OK, I opened the issue in neo: NeuralEnsemble/python-neo#1557. Let's close this.
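As a follow-up to the question above about failing loudly when a file is already open, here is a minimal sketch of such a guard in plain Python. The `TrackedReader` class and `_OPEN_FILES` registry are illustrative assumptions, not part of pyedflib, neo, or spikeinterface; an actual fix would live inside the neo reader itself.

```python
# Hypothetical sketch: keep a per-process registry of open files and raise
# an informative error on a second open, instead of an opaque assertion.
# `TrackedReader` and `_OPEN_FILES` are illustrative names only.
_OPEN_FILES = set()


class TrackedReader:
    def __init__(self, file_path):
        path = str(file_path)
        if path in _OPEN_FILES:
            # Fail with a clear message telling the user what to do.
            raise RuntimeError(
                f"{path} is already open in this process; "
                "close the existing reader before opening it again."
            )
        _OPEN_FILES.add(path)
        self._path = path

    def close(self):
        # Release the registry entry so the file can be opened again.
        _OPEN_FILES.discard(self._path)
```

Because the registry is per-process, this sketch also mirrors the observation above that separate worker processes can each open the same file without conflict.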