Error while reading a dandiset using NWBHDF5IO that has ImagingVolume #126
attn: @rly Perhaps an extension issue?
Hi @craterkamath, the issue appears to be with the NWB file rather than with PyNWB. The spec for ImagingVolume is:

{
    "neurodata_type_def": "ImagingVolume",
    "neurodata_type_inc": "ImagingPlane",
    "doc": "An Imaging Volume and its Metadata",
    "groups": [
        {
            "doc": "An optical channel used to record from an imaging volume",
            "quantity": "*",
            "neurodata_type_inc": "OpticalChannelPlus"
        },
        {
            "doc": "Ordered list of names of the optical channels in the data",
            "name": "order_optical_channels",
            "neurodata_type_inc": "OpticalChannelReferences"
        }
    ]
}

which specifies that 'order_optical_channels' must be stored as a subgroup of the ImagingVolume group, whereas in these files it appears to be stored as a link instead.

@dysprague Did you create these files? I think the ones with this problem should be fixed, either by (1) updating the generation script so that future files store 'order_optical_channels' as a subgroup, or (2) performing targeted edits on the existing files.
I know these are big files, so doing both options may be best: (1) fixes this in future runs of the script, and (2) fixes the existing files quickly. I can help with either option. attn @oruebel @bendichter in case you see this issue elsewhere.
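For anyone hitting the same error, one quick way to check whether a given file has this problem is to inspect the HDF5 link type directly with h5py. This is a sketch, not part of the thread; the file name is hypothetical, and the attribute-based lookup is an assumption about how the type is recorded:

import h5py

# Hypothetical file name; substitute the NWB file you want to check.
with h5py.File("sub-01.nwb", "r") as f:
    def check(name, obj):
        if isinstance(obj, h5py.Group) and obj.attrs.get("neurodata_type") == "ImagingVolume":
            link = obj.get("order_optical_channels", getlink=True)
            # A real subgroup shows up as a HardLink; the problematic files
            # have a SoftLink here instead (None means the member is absent).
            print(name, type(link).__name__)
    f.visititems(check)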
Hi @rly, thanks for the help on this. What you're saying mostly makes sense to me, but I had a few questions.

I am able to open these files completely fine on my own laptop. There is also dandiset 000692, created by Kotaro Kimura using the same spec, which I am also able to open fine. The other thing that is confusing to me is that in the spec, the MultiChannelVolume object also has a subgroup 'order_optical_channels', which is defined and set in exactly the same way as it is for ImagingVolume, so I'm not sure why the error is only being thrown for the ImagingVolume object.

When creating the ImagingVolume object, how would I add the OpticalChannelReferences object as a subgroup rather than as a link? I can definitely update the code used to generate these files, but as you said, these files are large, so it might be better to perform targeted data updates rather than fully regenerating the files. I would appreciate some help figuring out how to do that.

Thanks,
To follow up: @dysprague and I connected over Slack. @dysprague adjusted the script and the ndx-multichannel-volume extension used to generate the files, and I wrote a script to perform data surgery on the existing files.
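The surgery presumably amounts to replacing each ImagingVolume's 'order_optical_channels' soft link with a real subgroup copied from the link's target. A minimal h5py sketch of that idea (the function name and the attribute-based lookup are mine, not the actual script):

import h5py

def fix_order_optical_channels(path):
    # Replace each ImagingVolume's 'order_optical_channels' soft link with a
    # real subgroup copied from the link target. Idempotent: hard links
    # (already-correct files) are left untouched.
    with h5py.File(path, "r+") as f:
        volumes = []
        def collect(name, obj):
            if isinstance(obj, h5py.Group) and obj.attrs.get("neurodata_type") == "ImagingVolume":
                volumes.append(name)
        f.visititems(collect)  # collect first; mutating the file mid-visit is unsafe

        for name in volumes:
            grp = f[name]
            link = grp.get("order_optical_channels", getlink=True)
            if isinstance(link, h5py.SoftLink):
                target = link.path                  # path the soft link points to
                del grp["order_optical_channels"]   # remove the link itself
                f.copy(target, grp, name="order_optical_channels")  # materialize a real subgroup

After running something like this, re-validating the file with pynwb or the HDMF validator is worthwhile, since whether anything else needs adjusting depends on what the original script wrote.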
The next steps are to run a script that checks each NWB file in dandiset 000776 for this issue and, for each affected file, downloads it, runs the above script, and re-uploads it. We will also want to adjust the NWB files in dandisets 000715, 000565, 000541, 000472, 000714, and possibly 000692. This HDMF PR, hdmf-dev/hdmf#1050, will catch these errors during validation going forward. I also opened an HDMF issue, hdmf-dev/hdmf#1051, noting that these name mismatch issues (or, more generally, all validation issues) should raise an error on write.
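A sketch of what that batch pass could look like, reusing fix_order_optical_channels from the sketch above; the download/upload mechanics here are assumptions about the workflow, not the actual script:

import subprocess
from pathlib import Path
from dandi.dandiapi import DandiAPIClient

# Assumes upload rights to the dandiset and a DANDI API key in the environment;
# 'dandi upload' also expects to run inside a local dandiset directory.
with DandiAPIClient() as client:
    for asset in client.get_dandiset("000776").get_assets():
        if not asset.path.endswith(".nwb"):
            continue
        local = Path(asset.path)
        local.parent.mkdir(parents=True, exist_ok=True)
        asset.download(local)                    # fetch the file locally
        fix_order_optical_channels(str(local))   # from the sketch above; no-op on healthy files
        subprocess.run(["dandi", "upload", str(local)], check=True)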
Thanks @rly and @dysprague. Hoping to see the updated dataset on the DANDI Archive soon!
Bug description
I'm trying to read (stream or download) dandisets from the DANDI Hub, and the ones that contain an ImagingVolume, for example DANDI:000776, throw the error below:

The error does not occur when I read other dandisets that do not contain ImagingVolumes.
How to reproduce
I'm using the code below to read the dandiset:
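One common streaming pattern for DANDI assets looks like the sketch below; the remfile-based approach is an illustration, not necessarily the reporter's exact code, and picking the first asset is hypothetical:

from dandi.dandiapi import DandiAPIClient
import h5py
import remfile
from pynwb import NWBHDF5IO

# Dandiset ID from this issue; selecting the first asset is just for illustration.
with DandiAPIClient() as client:
    asset = next(client.get_dandiset("000776").get_assets())
    url = asset.get_content_url(follow_redirects=1, strip_query=True)

# Stream the remote HDF5 file without downloading it in full.
h5_file = h5py.File(remfile.File(url), "r")
with NWBHDF5IO(file=h5_file, load_namespaces=True) as io:
    nwbfile = io.read()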
Your personal set up
OS:
My package versions are below:
Python environment to reproduce: