Test suite failure on HDMF 3.14.4 #1494
So you are saying that with the new 3.14.4, something that worked before starts to fail? Sounds like a possible regression in HDMF... and indeed our CI has been full of red test fails for a while, but that was largely masked by the fact that I knew some other services were running red and was thus ignoring them... OK, time to look into it! That test started to fail on Aug 22nd on dev-deps, which install development versions of hdmf and pynwb (and keyring and nwbinspector): https://github.com/dandi/dandi-cli/blob/master/.github/workflows/test.yml#L71
The first non-dev-deps fail happened on Sep 05
with hdmf
whereas the prior day had
So indeed the hdmf release of changes from around Aug 22 seems to be to blame ;) so it is likely one of those 3 changes:

```
commit e0bedca13f167d55a4be5657044c4c6697de95ca
Author: Matthew Avaylon <[email protected]>
Date:   Thu Aug 22 08:45:29 2024 -0700

    Append a Dataset of References (#1135)

commit acc3d78cc5a828ddd384cca814ef60167ae92682
Author: Steph Prince <[email protected]>
Date:   Wed Aug 21 22:14:24 2024 -0700

    Write scalar datasets with compound data type (#1176)

    * add support for scalar compound datasets
    * add scalar compound dset io and validation tests
    * update CHANGELOG.md
    * Update tests/unit/test_io_hdf5_h5tools.py
      Co-authored-by: Ryan Ly <[email protected]>
    * update container repr conditionals
    ---------
    Co-authored-by: Ryan Ly <[email protected]>

commit 2b167aedc8a8f58afd75d3d0c750f6d620dc663d
Author: Steph Prince <[email protected]>
Date:   Wed Aug 21 13:48:42 2024 -0700

    Add support to write multidimensional string arrays (#1173)

    * add condition for multidim string arrays
    * add tests for multidim string array build
    * update condition when defining hdf5 dataset shape
    * add test to write multidim string array
    * update CHANGELOG.md
    * fix text decoding in test
    * add recursive string type for arrays of arbitrary dim
    * add test for compound data type with strings
    * add tests for multidim str attributes
    * fix line lengths
    * [pre-commit.ci] auto fixes from pre-commit.com hooks

      for more information, see https://pre-commit.ci
    * update compound dtype test
    ---------
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    Co-authored-by: Ryan Ly <[email protected]>
```
Since it is not clear which one it is, I will now look into the actual test fail and see if I can point more specifically to the effect.
Bisected to hdmf-dev/hdmf@2b167ae
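The bisection can be sketched as a `git bisect run` loop. The throwaway repository and pass/fail predicate below are illustrative stand-ins only; the real run would check out hdmf commits and run the failing dandi-cli test instead of reading `marker.txt`:

```shell
# Demo of automated bisection in a disposable repo (all names hypothetical).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email "[email protected]"
git config user.name "demo"
# Five commits; pretend "change 4" introduced the regression.
for i in 1 2 3 4 5; do
    echo "$i" > marker.txt
    git add marker.txt
    git commit -qm "change $i"
done
# bad = HEAD, good = HEAD~4; the run command exits non-zero once the bug is present.
git bisect start HEAD HEAD~4
result=$(git bisect run sh -c 'test "$(cat marker.txt)" -lt 4')
git bisect reset >/dev/null
echo "$result" | grep "first bad commit"
```

`git bisect run` drives the good/bad marking automatically from the exit code of the test command, which is what makes pinpointing a single upstream commit like 2b167ae cheap.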
it is about
and related to creation of that file, not loading etc... And since it complained only about that one and not the other one, it is about us "copying" an nwb via NWBHDF5IO in https://github.com/dandi/dandi-cli/blob/HEAD/dandi/pynwb_utils.py#L494:

```python
with pynwb.NWBHDF5IO(src, "r") as ior, pynwb.NWBHDF5IO(dest, "w") as iow:
    data = ior.read()
    data.generate_new_id()
    iow.export(ior, nwbfile=data)
```

As a result of that commit, we are getting a new file... and it has 12 more lines in the diff between the h5dump outputs of the original and the "copy":
and that

```
❯ grep StrDataset /home/yoh/.tmp/pytest-of-yoh/pytest-785/simple20/simple2.dump /home/yoh/.tmp/pytest-of-yoh/pytest-785/test_ambiguous0/simple2.dump
/home/yoh/.tmp/pytest-of-yoh/pytest-785/test_ambiguous0/simple2.dump: (0): "<StrDataset for HDF5 dataset "keywords": shape (2,), type "|O">"
```

That `<StrDataset ...>` string is quite a unique "thing" to appear at all, and it is present only in the "copy" of the file. ... and damn -- we are managing to hide such an error message/exception from the user even at debug level somehow, but there is a problem with HDMF being unable to load such a file.
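The corrupted value above, the literal text of a wrapper object's repr stored as the dataset's contents, is the classic symptom of an object being coerced to text instead of having its wrapped data materialized. A minimal, library-free sketch of that failure mode (the class below is an illustrative stand-in, not HDMF's actual `StrDataset`):

```python
class StrDataset:
    """Stand-in for a lazy string-dataset wrapper around an HDF5 dataset."""

    def __init__(self, name, values):
        self.name = name
        self.values = values

    def __iter__(self):
        # Iterating yields the real strings.
        return iter(self.values)

    def __repr__(self):
        return (f'<StrDataset for HDF5 dataset "{self.name}": '
                f'shape ({len(self.values)},), type "|O">')


keywords = StrDataset("keywords", ["mouse", "ephys"])

# Correct export path: materialize the wrapped strings.
good = list(keywords)

# Buggy export path: the wrapper is coerced to text, so its repr
# ends up stored verbatim as the dataset's (only) value.
bad = [str(keywords)]
```

Here `good` contains the two real keywords while `bad` contains a single string starting with `<StrDataset ...>`, matching what the grep found in the exported copy.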
I will for now declare this version of hdmf buggy for dandi-cli and file an issue with HDMF.
joblib does not automagically set up any kind of logging for Parallel. Filed a dedicated issue to possibly see it implemented: #1495. For the sake of the current use case (e.g. troubleshooting #1494) it should largely suffice to return and log information about the exception which was raised while loading metadata. This is what is done in this PR, and while using the buggy hdmf we do get nice logging in the log file at DEBUG level. No attempt was made to reduce a possible flood of duplicate log messages, since per-file metadata would have unique values.
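The pattern described, catching the exception inside the per-file worker, logging it at DEBUG, and returning the error text so the parallel caller can surface it, can be sketched as follows. The helper and loader names are hypothetical, not dandi-cli's actual API:

```python
import logging

lgr = logging.getLogger("dandi")


def safe_get_metadata(path, loader):
    """Load metadata for one file inside a joblib worker.

    On failure, log the exception at DEBUG (with traceback) and return a
    record carrying the error text instead of raising, so Parallel does
    not swallow it silently. (Hypothetical helper for illustration.)
    """
    try:
        return {"path": path, "metadata": loader(path)}
    except Exception as exc:
        lgr.debug("Failed to load metadata for %s: %s", path, exc, exc_info=True)
        return {"path": path, "error": str(exc)}


def boom(path):
    # Stand-in for a loader hitting the buggy hdmf read path.
    raise ValueError("buggy hdmf could not read %s" % path)


rec = safe_get_metadata("simple2.nwb", boom)
```

Because each worker returns the error instead of raising, the caller can both log and report which specific files failed and why.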
🚀 Issue was released in
Found this in the NWB Inspector
We use the following sequence in the actions: https://github.com/NeurodataWithoutBorders/nwbinspector/blob/a7e7172d1e1cabcba703c0c1ada343ab0864f79a/.github/workflows/dandi-dev.yml#L26-L35, usually minus line 34.

But without line 34 (the forcible pin to the previous HDMF version) this results in an error in dandi/tests/test_organize.py:test_ambiguous: https://github.com/NeurodataWithoutBorders/nwbinspector/actions/runs/10712486753/job/29702975698

Unsure what the true source of the problem is, though, and whether it's a problem to be concerned about in general; just wanted to let y'all know.
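The "line 34" workaround mentioned above is a workflow step pinning HDMF below the regressed release. A hedged sketch of such a step (the step name is invented, and the upper bound is an assumption based on this issue's title naming 3.14.4 as the broken version):

```yaml
# Hypothetical GitHub Actions step; adjust the bound to the last known-good release.
- name: Pin HDMF to a pre-regression release
  run: pip install "hdmf<3.14.4"
```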