Commit

Merge branch 'dev' into fix/link_timeseries_data
rly authored Oct 4, 2023
2 parents 3600079 + 734a6c4 commit fdf8f3e
Showing 23 changed files with 368 additions and 232 deletions.
3 changes: 2 additions & 1 deletion .codespellrc
@@ -1,7 +1,8 @@
[codespell]
+# in principle .ipynb can be corrected -- a good number of typos there
# nwb-schema -- excluding since submodule, should have its own fixes/checks
-skip = .git,*.pdf,*.svg,venvs,env,*.ipynb,nwb-schema
+skip = .git,*.pdf,*.svg,venvs,env,nwb-schema
ignore-regex = ^\s*"image/\S+": ".*
# it is optin in a url
# potatos - demanded to be left alone, autogenerated
ignore-words-list = optin,potatos
2 changes: 1 addition & 1 deletion .github/workflows/codespell.yml
@@ -16,4 +16,4 @@ jobs:
- name: Checkout
uses: actions/checkout@v3
- name: Codespell
-        uses: codespell-project/actions-codespell@v1
+        uses: codespell-project/actions-codespell@v2
12 changes: 6 additions & 6 deletions .github/workflows/run_all_tests.yml
@@ -196,9 +196,9 @@ jobs:
fail-fast: false
matrix:
include:
-        - { name: linux-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
-        - { name: windows-python3.11-ros3, python-ver: "3.11", os: windows-latest }
-        - { name: macos-python3.11-ros3 , python-ver: "3.11", os: macos-latest }
+        - { name: conda-linux-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
+        - { name: conda-windows-python3.11-ros3, python-ver: "3.11", os: windows-latest }
+        - { name: conda-macos-python3.11-ros3 , python-ver: "3.11", os: macos-latest }
steps:
- name: Cancel non-latest runs
uses: styfle/[email protected]
@@ -243,9 +243,9 @@ jobs:
fail-fast: false
matrix:
include:
-        - { name: linux-gallery-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
-        - { name: windows-gallery-python3.11-ros3, python-ver: "3.11", os: windows-latest }
-        - { name: macos-gallery-python3.11-ros3 , python-ver: "3.11", os: macos-latest }
+        - { name: conda-linux-gallery-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
+        - { name: conda-windows-gallery-python3.11-ros3, python-ver: "3.11", os: windows-latest }
+        - { name: conda-macos-gallery-python3.11-ros3 , python-ver: "3.11", os: macos-latest }
steps:
- name: Cancel non-latest runs
uses: styfle/[email protected]
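Context for the conda-* job renames above: these ros3 jobs install h5py from conda-forge, whose builds typically include HDF5's read-only S3 (ros3) driver, unlike the PyPI wheels. A minimal sketch (not part of this commit) for checking ros3 availability locally:

```python
# Sketch: check whether the installed h5py exposes the ros3 driver.
# Assumes h5py >= 3; conda-forge builds include ros3, PyPI wheels do not.
import h5py

if "ros3" in h5py.registered_drivers():
    print("ros3 driver is available")
else:
    print("ros3 driver missing; try: conda install -c conda-forge h5py")
```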
2 changes: 1 addition & 1 deletion .github/workflows/run_dandi_read_tests.yml
@@ -47,4 +47,4 @@ jobs:
- name: Run DANDI read tests
run: |
-          pytest -rP tests/read_dandi/
+          python tests/read_dandi/test_read_dandi.py
4 changes: 2 additions & 2 deletions .github/workflows/run_tests.yml
@@ -174,7 +174,7 @@ jobs:
fail-fast: false
matrix:
include:
-        - { name: linux-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
+        - { name: conda-linux-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
steps:
- name: Cancel non-latest runs
uses: styfle/[email protected]
@@ -219,7 +219,7 @@ jobs:
fail-fast: false
matrix:
include:
-        - { name: linux-gallery-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
+        - { name: conda-linux-gallery-python3.11-ros3 , python-ver: "3.11", os: ubuntu-latest }
steps:
- name: Cancel non-latest runs
uses: styfle/[email protected]
2 changes: 1 addition & 1 deletion docs/gallery/general/plot_file.py
@@ -33,7 +33,7 @@
^^^^^^^^^^
:py:class:`~pynwb.base.TimeSeries` objects store time series data and correspond to the *TimeSeries* specifications
-provided by the `NWB Format`_ . Like the NWB specification, :py:class:`~pynwb.base.TimeSeries` Python objects
+provided by the `NWB Format`_. Like the NWB specification, :py:class:`~pynwb.base.TimeSeries` Python objects
follow an object-oriented inheritance pattern, i.e., the class :py:class:`~pynwb.base.TimeSeries`
serves as the base class for all other :py:class:`~pynwb.base.TimeSeries` types, such as,
:py:class:`~pynwb.ecephys.ElectricalSeries`, which itself may have further subtypes, e.g.,
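Since this gallery text describes the TimeSeries inheritance pattern, here is a minimal sketch (not part of this commit) of constructing the base type; all values are illustrative:

```python
# Sketch: TimeSeries is the base class that subtypes such as
# ElectricalSeries extend, per the gallery text above.
from pynwb import TimeSeries

ts = TimeSeries(
    name="example_timeseries",
    data=[1.0, 2.0, 3.0],        # illustrative samples
    unit="m",
    timestamps=[0.0, 0.1, 0.2],  # seconds
)
print([cls.__name__ for cls in type(ts).__mro__])  # inheritance chain
```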
4 changes: 2 additions & 2 deletions docs/gallery/general/scratch.py
@@ -12,7 +12,7 @@
.. note::
The scratch space is explicitly for non-standardized data that is not intended for reuse
-    by others. Standard NWB:N types, and extension if required, should always be used for any data that you
+    by others. Standard NWB types, and extension if required, should always be used for any data that you
intend to share. As such, published data should not include scratch data and a user should be able
to ignore any data stored in scratch to use a file.
@@ -127,7 +127,7 @@
#
# You may end up wanting to store results from some one-off analysis, and writing an extension
# to get your data into an NWBFile is too much overhead. This is facilitated by the scratch space
-# in NWB:N. [#]_
+# in NWB. [#]_
#
# First, lets read our processed data and then make a copy

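The scratch-space workflow this gallery file documents, as a minimal sketch (not part of this commit; assumes a pynwb version where add_scratch accepts description=):

```python
# Sketch: stash a one-off analysis result in the NWB scratch space.
from datetime import datetime
from dateutil.tz import tzlocal
import numpy as np
from pynwb import NWBFile

nwbfile = NWBFile(
    session_description="demo session",
    identifier="demo-0001",
    session_start_time=datetime.now(tzlocal()),
)
filtered = np.random.rand(100)  # stand-in for a one-off analysis result
nwbfile.add_scratch(
    filtered,
    name="filtered_data",
    description="one-off filtered trace; not intended for reuse",
)
```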
@@ -39,7 +39,7 @@
"source": [
"This notebook uses the convert script and API for NWB v.1.0.6 (not the current NWB 2.0 and PyNWB) to generate NWB v1.0.6 data files and compare with the current format. This notebook is mainly for comparison purposes. The corresponding notebook for converting the MeisterLab example data to NWB 2.x is available here: https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/docs/notebooks/convert-crcns-ret-1-meisterlab.ipynb .\n",
"\n",
"This example is based on https://github.com/NeurodataWithoutBorders/api-python/blob/master/examples/create_scripts/crcns_ret-1.py from H5Gate (i.e., the orignal write API for NWB v1.x). A tar file with the example data is available for download from: https://portal.nersc.gov/project/crcns/download/nwb-1/example_script_data/source_data_2.tar.gz Please download and uncompress the data file and update the paths in the *Settings* section if you want to run the notebook. "
"This example is based on https://github.com/NeurodataWithoutBorders/api-python/blob/master/examples/create_scripts/crcns_ret-1.py from H5Gate (i.e., the original write API for NWB v1.x). A tar file with the example data is available for download from: https://portal.nersc.gov/project/crcns/download/nwb-1/example_script_data/source_data_2.tar.gz Please download and uncompress the data file and update the paths in the *Settings* section if you want to run the notebook. "
]
},
{
@@ -71,7 +71,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# 3 Exectute convert using the original H5Gate API"
"# 3 Execute convert using the original H5Gate API"
]
},
{
@@ -1259,7 +1259,7 @@
"source": [
"Compared to the convert using NWB v1.0.x shown above, the NWB 2 convert example makes the following main changes:\n",
"\n",
"* NWB 2.x uses the extension mechanism to add custom data fields rather than adding unspecified custom data directly to the file, i.e., all objects (datasets, attributes, groups etc.) are governed by a formal specification. E.g., in the original script for NWB 1.0.x, pixle_size, meister_x, meister_y, meister_dx, meister_dy were stored as custom datasets in ImageSeries. For NWB 2 we create an extensions MeisterImageSeries which extens ImageSeries and stores those values as attributes pixel_size, x, y, dx, dy. For NWB 2 we chosse attributes instead of datasets simply because these are small, single int and float metadata values for which attributes are more approbirate.\n",
"* NWB 2.x uses the extension mechanism to add custom data fields rather than adding unspecified custom data directly to the file, i.e., all objects (datasets, attributes, groups etc.) are governed by a formal specification. E.g., in the original script for NWB 1.0.x, pixle_size, meister_x, meister_y, meister_dx, meister_dy were stored as custom datasets in ImageSeries. For NWB 2 we create an extensions MeisterImageSeries which extens ImageSeries and stores those values as attributes pixel_size, x, y, dx, dy. For NWB 2 we chose attributes instead of datasets simply because these are small, single int and float metadata values for which attributes are more approbirate.\n",
"* Change si_unit attribute to unit for compliance with the spec of ImageSeries \n",
"* Moved 'source' attribute from the Module to the Interface as source is not defined in the spec for modules but only for Interface\n",
"* Added missing 'source' for SpikeUnit\n",
@@ -1269,7 +1269,7 @@
"* NWBContainer is now a base type of all core neurodata_types and as such `help` and `source` attributes have been added to all core types\n",
"* The original script reused iterator variables in nested loops. We have updated those occurrence to avoid consusion and avoid possible errors. \n",
"* The following custom metadata fields---i.e., datasets that were originally added to the file without being part of the NWB specification and without creation of corresponding extensions---have not yet been integrated with the NWB files:\n",
" * /general custom metdata: /notes, /random_number_generation, /related_publications. This will require extension of NWBFile to extend the spec of /general. Improvements to make this easier have been proposed for discussion at the upcoming hackathon.\n",
" * /general custom metadata: /notes, /random_number_generation, /related_publications. This will require extension of NWBFile to extend the spec of /general. Improvements to make this easier have been proposed for discussion at the upcoming hackathon.\n",
" * SpikeUnit custom datasets with additional copies of the per-stimulus spike times (i.e., /processing/Cells/UnitTimes/cell_*/stim_* in the original version). This will require an extension for SpikeUnit.\n",
" * /subject, subject/genotype, subject/species : See Issue https://github.com/NeurodataWithoutBorders/pynwb/issues/45 support for subject metadata is upcoming in PyNWB \n",
" * /specifications, /specifications/nwb_core.py : See Issue hssue https://github.com/NeurodataWithoutBorders/pynwb/issues/44 will be added by PyNWB automatically"
14 changes: 7 additions & 7 deletions docs/notebooks/convert-crcns-ret-1-meisterlab.ipynb
@@ -109,7 +109,7 @@
"This example is based on https://github.com/NeurodataWithoutBorders/api-python/blob/master/examples/create_scripts/crcns_ret-1.py from H5Gate. \n",
"\n",
"Compared to the NWB files generated by the original example we here use the extension mechanism to add custom data fields rather than adding unspecified custom data directly to the file, i.e., all objects (datasets, attributes, groups etc.) are governed by a formal specification.\n",
"* Previously pixle_size, meister_x, meister_y, meister_dx, meister_dy were stored as custom datasets in ImageSeries. Here we create an extensions MeisterImageSeries which extens ImageSeries and stores that values as attributes pixel_size, x, y, dx, dy. We here chosse attributes instead of datasets simply because these are small, single int and float metadata values for which attributes are more approbirate.\n",
"* Previously pixle_size, meister_x, meister_y, meister_dx, meister_dy were stored as custom datasets in ImageSeries. Here we create an extensions MeisterImageSeries which extens ImageSeries and stores that values as attributes pixel_size, x, y, dx, dy. We here chose attributes instead of datasets simply because these are small, single int and float metadata values for which attributes are more approbirate.\n",
"\n",
"Compared to the NWB files generated by the original example the files generated here contain the following additional main changes:\n",
"\n",
@@ -123,7 +123,7 @@
"* NWBContainer is now a base type of all core neurodata_types and as such `help` and `source` attributes have been added to all core types\n",
"* The original script reused iterator variables in nested loops. We have updated those occurrence to avoid consusion and avoid possible errors. \n",
"* The following custom metadata fields---i.e., datasets that were originally added to the file without being part of the NWB specification and without creation of corresponding extensions---have not yet been integrated with the NWB files:\n",
" * /general custom metdata: /notes, /random_number_generation, /related_publications. This will require extension of NWBFile to extend the spec of /general. Improvements to make this easier have been proposed for discussion at the upcoming hackathon.\n",
" * /general custom metadata: /notes, /random_number_generation, /related_publications. This will require extension of NWBFile to extend the spec of /general. Improvements to make this easier have been proposed for discussion at the upcoming hackathon.\n",
" * SpikeUnit custom datasets with additional copies of the per-stimulus spike times (i.e., /processing/Cells/UnitTimes/cell_*/stim_* in the original version). This will require an extension for SpikeUnit.\n",
" * /subject, subject/genotype, subject/species : See Issue https://github.com/NeurodataWithoutBorders/pynwb/issues/45 support for subject metadata is upcoming in PyNWB \n",
" * /specifications, /specifications/nwb_core.py : See Issue https://github.com/NeurodataWithoutBorders/pynwb/issues/44 will be added by PyNWB automatically\n",
@@ -591,7 +591,7 @@
"# Build the namespace\n",
"ns_builder = NWBNamespaceBuilder('Extension for use in my Lab', ns_name)\n",
"\n",
"# Create a custom ImageSeries to add our custom attributes and add our extenions to the namespace\n",
"# Create a custom ImageSeries to add our custom attributes and add our extensions to the namespace\n",
"mis_ext = NWBGroupSpec('A custom ImageSeries to add MeisterLab custom metadata',\n",
" attributes=[NWBAttributeSpec('x' , 'int', 'meister x', required=False),\n",
" NWBAttributeSpec('y' , 'int', 'meister y', required=False),\n",
@@ -697,7 +697,7 @@
"metadata": {},
"source": [
"We can now inspect our container class using the usual mechanisms, e.g., help. For illustration purposes, let's call help on our class. Here we can see that:\n",
"* Our custom attributes have been added to the constructor with approbriate documention describing the type and purpose we indicated in the spec for our attributes\n",
"* Our custom attributes have been added to the constructor with approbriate documentation describing the type and purpose we indicated in the spec for our attributes\n",
"* From the \"Method resolution order\" documentationw we can see that our MeisterImageSeries inherits from pynwb.image.ImageSeries so that interaction mechanism from the base class are also available in our class"
]
},
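The builder cell above is truncated before the namespace is written out; under this vintage of the pynwb.spec API, the remaining steps would look roughly like the following sketch (file names are illustrative, not from the notebook):

```python
# Sketch: attach the MeisterImageSeries spec to a source file and
# export the namespace so the extension can be loaded later.
ext_source = "mylab.extensions.yaml"   # hypothetical file name
ns_path = "mylab.namespace.yaml"       # hypothetical file name

ns_builder.add_spec(ext_source, mis_ext)
ns_builder.export(ns_path)
```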
@@ -746,7 +746,7 @@
" | bits_per_pixel (int): Number of bit per image pixel\n",
" | dimension (Iterable): Number of pixels on x, y, (and z) axes.\n",
" | resolution (float): The smallest meaningful difference (in specified unit) between values in data\n",
" | conversion (float): Scalar to multiply each element by to conver to volts\n",
" | conversion (float): Scalar to multiply each element by to convert to volts\n",
" | timestamps (ndarray or list or tuple or Dataset or DataChunkIterator or DataIO or TimeSeries): Timestamps for samples stored in data\n",
" | starting_time (float): The timestamp of the first sample\n",
" | rate (float): Sampling rate in Hz\n",
@@ -885,7 +885,7 @@
"def convert_single_file(file_stimulus_data, file_meta, spike_units, electrode_meta):\n",
" import h5py\n",
" #########################################\n",
" # Create the NWBFile containter\n",
" # Create the NWBFile container\n",
" ##########################################\n",
" nwbfile = NWBFile(session_description=file_meta['description'],\n",
" identifier=file_meta['identifier'],\n",
@@ -1004,7 +1004,7 @@
"source": [
"## Step 5.2: Convert all files\n",
"\n",
"Convert all the files by iteating over the files and calling `convert_single_file` function for each of the file"
"Convert all the files by iterating over the files and calling `convert_single_file` function for each of the file"
]
},
{
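A sketch (not from the notebook) of what that driver loop might look like, reusing the convert_single_file signature shown above; the collection name is hypothetical:

```python
# Sketch: convert every recording by calling convert_single_file per file.
for file_meta in all_file_metadata:  # hypothetical list of per-file metadata
    convert_single_file(file_stimulus_data, file_meta, spike_units, electrode_meta)
```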