How to truncate an electrical series and preserve compression? #88
Unanswered
DailyDreaming asked this question in Q&A
Replies: 3 comments
-
Yes, in HDF5 if you delete an old dataset and create a new one, the old dataset still takes up space. What you should do instead is:

```python
import h5py

with h5py.File("original.raw.h5.nwb", "r+") as f:
    dset = f["acquisition"]["ElectricalSeries"]["data"]
    # Keep everything after the first 1,400,000 noisy samples (952 channels)
    trimmed_data = dset[1400000:, :952]
    # Shift the kept samples to the front of the dataset
    dset[:-1400000, :952] = trimmed_data
    # Shrink the dataset in place; chunking and compression settings are untouched
    dset.resize((trimmed_data.shape[0], 952))
```

This will only work if the dataset is chunked to begin with (which is true in your case).
-
Make sure to update your timestamps or starting_time accordingly!
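For example, assuming the series stores either a starting_time dataset with a rate attribute or a timestamps dataset (the usual NWB layouts), and the same 1,400,000-sample trim as above, a sketch of the adjustment:

```python
import h5py

n_trimmed = 1400000  # samples removed from the front, as above

with h5py.File("original.raw.h5.nwb", "r+") as f:
    series = f["acquisition"]["ElectricalSeries"]
    if "starting_time" in series:
        # Shift starting_time forward by the duration of the removed samples
        rate = series["starting_time"].attrs["rate"]
        series["starting_time"][()] = series["starting_time"][()] + n_trimmed / rate
    elif "timestamps" in series:
        # Drop the corresponding timestamps and shrink that dataset as well
        # (this also requires the timestamps dataset to be chunked/resizable)
        ts = series["timestamps"]
        kept = ts[n_trimmed:]
        ts[: kept.shape[0]] = kept
        ts.resize((kept.shape[0],))
```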
-
Thank you!
-
I have several NWB files that have an electrical series where I'd like to trim off the first noisy 70 seconds or so. What is the best way of doing this?
The only reliable way I've found so far is using h5py to slice the data and rewrite the dataset.
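A minimal sketch of this kind of slice-and-recreate approach (illustrative file name and sizes, not necessarily the exact snippet used) that ends up with an uncompressed copy:

```python
import h5py

with h5py.File("original.raw.h5.nwb", "r+") as f:
    series = f["acquisition"]["ElectricalSeries"]
    trimmed = series["data"][1400000:, :952]  # drop the noisy leading samples
    del series["data"]  # unlinks the dataset, but its space is not freed
    # create_dataset defaults to no chunking and no compression, and the
    # original dataset's attributes are not carried over
    series.create_dataset("data", data=trimmed)
```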
However, this balloons the file from ~13 GB to ~30 GB, and when inspected, the new dataset is uncompressed (compression=None). Is there a way to modify the dataset so that it still has gzip compression? Thanks in advance!