v0.4.6
Back-compatibility break
Changed the metadata schema for Fluorescence and DfOverF so that the traces metadata is provided as a dict instead of a list of dicts. The name of the plane segmentation is used to determine which traces are added to the Fluorescence and DfOverF containers. PR #632
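A minimal before/after sketch of the shape change; the dictionary keys and field names below are illustrative assumptions, not the exact schema:

```python
# Illustrative only -- keys and fields are assumptions, not the library's exact schema.
metadata = dict(Ophys=dict())

# Before: traces metadata provided as a list of dicts.
metadata["Ophys"]["Fluorescence"] = dict(
    roi_response_series=[
        dict(name="RoiResponseSeries", description="fluorescence traces"),
    ],
)

# After: traces metadata provided as a dict, keyed so that the plane segmentation
# name determines which traces are added to the Fluorescence/DfOverF containers.
metadata["Ophys"]["Fluorescence"] = dict(
    PlaneSegmentation=dict(
        RoiResponseSeries=dict(name="RoiResponseSeries", description="fluorescence traces"),
    ),
)
```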
Features
Added Pydantic data models of BackendConfiguration for both HDF5 and Zarr backends (a container/mapper of all the DatasetConfigurations for a particular file). PR #568
Modified the filtering of traces to also filter out traces with empty values. PR #649
Added tool function get_default_dataset_configurations for identifying and collecting all fields of an in-memory NWBFile that could become datasets on disk, returning instances of the Pydantic dataset models filled with default values for chunking, buffering, and compression. PR #569
Added tool function get_default_backend_configuration for conveniently packaging the results of get_default_dataset_configurations into an easy-to-modify mapping from locations of objects within the file to their corresponding dataset configuration options, as well as linking to a specific backend DataIO (a usage sketch follows this list). PR #570
Added set_probe() method to BaseRecordingExtractorInterface. PR #639
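A hedged usage sketch of the new dataset/backend configuration helpers. The import path, keyword arguments, and attribute names below are assumptions inferred from the entries above, not a verified API:

```python
from pynwb.testing.mock.file import mock_NWBFile

# Import path is an assumption; the function name comes from the entries above.
from neuroconv.tools.nwb_helpers import get_default_backend_configuration

nwbfile = mock_NWBFile()  # stand-in for any in-memory NWBFile built by an interface

# Collect default chunking/buffering/compression options for every field of the
# in-memory NWBFile that could become a dataset on disk, packaged as a Pydantic
# BackendConfiguration for the chosen backend ("hdf5" or "zarr").
backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend="hdf5")

# Assumed attribute: a mapping from object locations within the file to their
# corresponding dataset configuration options, which can be modified before writing.
for location, dataset_configuration in backend_configuration.dataset_configurations.items():
    print(location, dataset_configuration)
```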
Fixes
Fixed GenericDataChunkIterator (in hdmf.py) in the case where the number of dimensions is 1 and the size in bytes is greater than the threshold of 1 GB. PR #638
Changed np.floor and np.prod usage to math.floor and math.prod in various files (a small sketch of this kind of size calculation follows this list). PR #638
Updated the minimum required version of the DANDI CLI; updated the run_conversion_from_yaml API function and tests to be compatible with naming changes. PR #664
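As a generic illustration of the kind of size arithmetic involved (not the library's actual code), math.prod and math.floor operate on arbitrary-precision Python integers, which is convenient when element counts and byte sizes exceed the 1 GB threshold mentioned above:

```python
import math

# A one-dimensional dataset whose size in bytes exceeds 1 GB.
shape = (3_000_000_000,)  # 3 billion samples
itemsize = 2              # e.g. int16

n_elements = math.prod(shape)          # plain Python int, no fixed-width overflow
size_in_bytes = n_elements * itemsize  # 6_000_000_000 bytes

# Number of elements that fit in a 1 GB buffer, as in a chunk/buffer-shape calculation.
elements_per_buffer = math.floor(1 * 1024**3 / itemsize)
print(size_in_bytes, elements_per_buffer)
```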
Improvements
Changed the metadata extraction library from fparse to parse (see the sketch after this list). PR #654
The dandi CLI/API is now an optional dependency; it is still required for the automated upload tool function and for the YAML-based NeuroConv CLI. PR #655
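A brief illustration of the parse library's pattern-based extraction, which is the general mechanism the metadata extraction now relies on; the pattern and field names are made up for the example:

```python
from parse import parse

# Extract identifiers from a filename using a format-string-style pattern.
# The pattern and field names are illustrative, not NeuroConv's actual templates.
result = parse("{subject_id}_{session_id}.dat", "mouse001_sessionA.dat")
print(result.named)  # {'subject_id': 'mouse001', 'session_id': 'sessionA'}
```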