
Commit

Merge branch 'dev' into streaming_add_remfile2
rly authored Jan 13, 2024
2 parents bae8d30 + 1529028 commit 7974499
Showing 16 changed files with 136 additions and 87 deletions.
25 changes: 10 additions & 15 deletions .github/workflows/run_dandi_read_tests.yml
@@ -1,15 +1,15 @@
name: Run DANDI read tests
on:
schedule:
- cron: '0 6 * * *' # once per day at 1am ET
# NOTE this is disabled until we can run this systematically instead of randomly
# so we don't get constant error notifications and waste compute cycles
# See https://github.com/NeurodataWithoutBorders/pynwb/issues/1804
# schedule:
# - cron: '0 6 * * *' # once per day at 1am ET
workflow_dispatch:

jobs:
run-tests:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0} # necessary for conda
steps:
- name: Cancel non-latest runs
uses: styfle/[email protected]
@@ -22,19 +22,14 @@ jobs:
submodules: 'recursive'
fetch-depth: 0 # tags are required for versioneer to determine the version

- name: Set up Conda
uses: conda-incubator/setup-miniconda@v2
- name: Set up Python
uses: actions/setup-python@v4
with:
auto-update-conda: true
activate-environment: ros3
environment-file: environment-ros3.yml
python-version: "3.11"
channels: conda-forge
auto-activate-base: false
python-version: '3.11'

- name: Install run dependencies
run: |
python -m pip install dandi pytest
python -m pip install dandi fsspec requests aiohttp pytest
python -m pip uninstall -y pynwb # uninstall pynwb
python -m pip install -e .
python -m pip list
@@ -47,4 +42,4 @@ jobs:
- name: Run DANDI read tests
run: |
python tests/read_dandi/test_read_dandi.py
python tests/read_dandi/read_dandi.py
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -6,7 +6,15 @@
- For `NWBHDF5IO()`, change the default of arg `load_namespaces` from `False` to `True`. @bendichter [#1748](https://github.com/NeurodataWithoutBorders/pynwb/pull/1748)
- Add `NWBHDF5IO.can_read()`. @bendichter [#1703](https://github.com/NeurodataWithoutBorders/pynwb/pull/1703)
- Add `pynwb.get_nwbfile_version()`. @bendichter [#1703](https://github.com/NeurodataWithoutBorders/pynwb/pull/1703)
- Updated timeseries data checks to warn instead of error when reading invalid files. @stephprince [#1793](https://github.com/NeurodataWithoutBorders/pynwb/pull/1793)
- Updated timeseries data checks to warn instead of error when reading invalid files. @stephprince [#1793](https://github.com/NeurodataWithoutBorders/pynwb/pull/1793) and [#1809](https://github.com/NeurodataWithoutBorders/pynwb/pull/1809)
- Expose the offset, conversion and channel conversion parameters in `mock_ElectricalSeries`. @h-mayorquin [#1796](https://github.com/NeurodataWithoutBorders/pynwb/pull/1796)
- Expose `starting_time` in `mock_ElectricalSeries`. @h-mayorquin [#1805](https://github.com/NeurodataWithoutBorders/pynwb/pull/1805)
- Enhance `get_data_in_units()` to work with objects that have a `channel_conversion` attribute like the `ElectricalSeries`. @h-mayorquin [#1806](https://github.com/NeurodataWithoutBorders/pynwb/pull/1806)
- Refactor validation CLI tests to use `{sys.executable} -m coverage` to use the same Python version and run correctly on Debian systems. @yarikoptic [#1811](https://github.com/NeurodataWithoutBorders/pynwb/pull/1811)

### Bug fixes
- Fix bug where namespaces were loaded in "w-" mode. @h-mayorquin [#1795](https://github.com/NeurodataWithoutBorders/pynwb/pull/1795)
- Fix bug where pynwb version was reported as "unknown" to readthedocs @stephprince [#1810](https://github.com/NeurodataWithoutBorders/pynwb/pull/1810)

### Documentation and tutorial enhancements
- Add RemFile to streaming tutorial @bendichter [#1761](https://github.com/NeurodataWithoutBorders/pynwb/pull/1761)
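
To ground a few of the API entries above, a minimal usage sketch (illustrative only, not part of this commit; it assumes a pynwb build containing these changes and a hypothetical local file "example.nwb"):

from pynwb import NWBHDF5IO, get_nwbfile_version

path = "example.nwb"  # hypothetical existing NWB file

# NWBHDF5IO.can_read() reports whether this IO class can open the file.
print(NWBHDF5IO.can_read(path))

# get_nwbfile_version() reports the NWB format version stored in the file.
print(get_nwbfile_version(path))

# load_namespaces now defaults to True, so cached extension namespaces
# are loaded automatically on read.
with NWBHDF5IO(path, mode="r") as io:
    nwbfile = io.read()
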
2 changes: 1 addition & 1 deletion docs/source/install_users.rst
@@ -34,7 +34,7 @@ This will automatically install the following required dependencies:
Install release from Conda-forge
--------------------------------

`Conda-forge <https://conda-forge.org/#about>`_ is a community led collection of recipes, build infrastructure
`Conda-forge <https://conda-forge.org>`_ is a community led collection of recipes, build infrastructure
and distributions for the `conda <https://conda.io/docs/>`_ package manager.

To install or update PyNWB distribution from conda-forge using conda simply run:
4 changes: 2 additions & 2 deletions docs/source/software_process.rst
@@ -17,15 +17,15 @@ tested on all supported operating systems and python distributions. That way,
as a contributor, you know if you introduced regressions or coding style
inconsistencies.

There are badges in the :pynwb:`README <#readme>` file which shows
There are badges in the :pynwb:`README <blob/dev/README.rst>` file which shows
the current condition of the dev branch.

--------
Coverage
--------

Code coverage is computed and reported using the coverage_ tool. There are two coverage-related
badges in the :pynwb:`README <#readme>` file. One shows the status of the :pynwb:`GitHub Action workflow <actions?query=workflow%3A%22Run+coverage%22>` which runs the coverage_ tool and uploads the report to
badges in the :pynwb:`README <blob/dev/README.rst>` file. One shows the status of the :pynwb:`GitHub Action workflow <actions?query=workflow%3A%22Run+coverage%22>` which runs the coverage_ tool and uploads the report to
codecov_, and the other badge shows the percentage coverage reported from codecov_. A detailed report can be found on
codecov_, which shows line by line which lines are covered by the tests.

2 changes: 1 addition & 1 deletion environment-ros3.yml
@@ -12,7 +12,7 @@ dependencies:
- pandas==2.0.0
- python-dateutil==2.8.2
- setuptools
- dandi==0.55.1 # NOTE: dandi does not support osx-arm64
- dandi==0.59.0 # NOTE: dandi does not support osx-arm64
- fsspec==2023.6.0
- requests==2.28.1
- aiohttp==3.8.3
3 changes: 2 additions & 1 deletion setup.cfg
@@ -3,6 +3,7 @@ VCS = git
versionfile_source = src/pynwb/_version.py
versionfile_build = pynwb/_version.py
tag_prefix = ''
style = pep440-pre

[flake8]
max-line-length = 120
@@ -28,7 +29,7 @@ per-file-ignores =
tests/integration/__init__.py:F401
src/pynwb/testing/__init__.py:F401
src/pynwb/validate.py:T201
tests/read_dandi/test_read_dandi.py:T201
tests/read_dandi/read_first_nwb_asset.py:T201
setup.py:T201
test.py:T201
scripts/*:T201
3 changes: 2 additions & 1 deletion src/pynwb/__init__.py
@@ -261,7 +261,8 @@ def __init__(self, **kwargs):
popargs('path', 'mode', 'manager', 'extensions', 'load_namespaces',
'file', 'comm', 'driver', 'herd_path', kwargs)
# Define the BuildManager to use
if mode in 'wx' or manager is not None or extensions is not None:
io_modes_that_create_file = ['w', 'w-', 'x']
if mode in io_modes_that_create_file or manager is not None or extensions is not None:
load_namespaces = False

if load_namespaces:
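
An illustrative sketch (not part of this commit) of what the new mode list changes: "w-" and "x" now count as file-creating modes, so namespace loading is skipped for them just as it already was for "w":

from pynwb import NWBHDF5IO
from pynwb.testing.mock.file import mock_NWBFile

# "w", "w-", and "x" all create a new file, so there are no cached
# namespaces to load; reading modes still load namespaces by default.
with NWBHDF5IO("new_file.nwb", mode="w-") as io:  # hypothetical output path
    io.write(mock_NWBFile())
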
2 changes: 1 addition & 1 deletion src/pynwb/_version.py
@@ -44,7 +44,7 @@ def get_config():
cfg = VersioneerConfig()
cfg.VCS = "git"
cfg.style = "pep440-pre"
cfg.tag_prefix = "*.*.*"
cfg.tag_prefix = ""
cfg.parentdir_prefix = "None"
cfg.versionfile_source = "src/pynwb/_version.py"
cfg.verbose = False
12 changes: 9 additions & 3 deletions src/pynwb/base.py
@@ -187,10 +187,12 @@ def __init__(self, **kwargs):
if isinstance(timestamps, TimeSeries):
timestamps.__add_link('timestamp_link', self)
elif self.rate is not None:
if self.rate <= 0:
if self.rate < 0:
self._error_on_new_warn_on_construct(
error_msg='Rate must be a positive value.'
error_msg='Rate must not be a negative value.'
)
elif self.rate == 0.0 and get_data_shape(data)[0] > 1:
warn('Timeseries has a rate of 0.0 Hz, but the length of the data is greater than 1.')
if self.starting_time is None: # override default if rate is provided but not starting time
self.starting_time = 0.0
self.starting_time_unit = self.__time_unit
@@ -296,7 +298,11 @@ def get_timestamps(self):
return np.arange(len(self.data)) / self.rate + self.starting_time

def get_data_in_units(self):
return np.asarray(self.data) * self.conversion + self.offset
if "channel_conversion" in self.fields:
scale_factor = self.conversion * self.channel_conversion[:, np.newaxis]
else:
scale_factor = self.conversion
return np.asarray(self.data) * scale_factor + self.offset


@register_class('Image', CORE_NAMESPACE)
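
For illustration only (not part of this commit), the arithmetic performed by the updated get_data_in_units() can be reproduced standalone; the shapes below mirror the new unit test and assume the per-row factors in channel_conversion line up with the first axis of data:

import numpy as np

# Reproduce the scaling applied by get_data_in_units() above.
data = np.ones((2, 5))
conversion = 2.0
offset = 3.0
channel_conversion = np.array([1.0, 10.0])  # one factor per row of `data`

scale_factor = conversion * channel_conversion[:, np.newaxis]  # shape (2, 1)
data_in_units = data * scale_factor + offset
print(data_in_units)  # row 0 -> 2*1 + 3 = 5.0, row 1 -> 2*10 + 3 = 23.0
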
10 changes: 9 additions & 1 deletion src/pynwb/testing/mock/ecephys.py
@@ -71,18 +71,26 @@ def mock_ElectricalSeries(
data=None,
rate: float = 30000.0,
timestamps=None,
starting_time: Optional[float] = None,
electrodes: Optional[DynamicTableRegion] = None,
filtering: str = "filtering",
nwbfile: Optional[NWBFile] = None
nwbfile: Optional[NWBFile] = None,
channel_conversion: Optional[np.ndarray] = None,
conversion: float = 1.0,
offset: float = 0.,
) -> ElectricalSeries:
electrical_series = ElectricalSeries(
name=name or name_generator("ElectricalSeries"),
description=description,
data=data if data is not None else np.ones((10, 5)),
rate=rate,
starting_time=starting_time,
timestamps=timestamps,
electrodes=electrodes or mock_electrodes(nwbfile=nwbfile),
filtering=filtering,
conversion=conversion,
offset=offset,
channel_conversion=channel_conversion,
)

if nwbfile is not None:
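
A possible usage sketch for the extended mock (illustrative; assumes the new keyword arguments shown above are available):

import numpy as np
from pynwb.testing.mock.file import mock_NWBFile
from pynwb.testing.mock.ecephys import mock_ElectricalSeries

# Build a mock series with the newly exposed parameters and attach it
# to an in-memory NWBFile.
nwbfile = mock_NWBFile()
series = mock_ElectricalSeries(
    nwbfile=nwbfile,
    starting_time=1.5,
    conversion=1e-6,  # e.g. raw values stored in microvolts
    offset=0.0,
)
print(series.starting_time, series.conversion)
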
3 changes: 3 additions & 0 deletions src/pynwb/testing/mock/ophys.py
@@ -132,6 +132,7 @@ def mock_OnePhotonSeries(
conversion=conversion,
timestamps=timestamps,
starting_time=starting_time,
offset=offset,
rate=rate,
comments=comments,
description=description,
@@ -162,6 +163,7 @@ def mock_TwoPhotonSeries(
dimension=None,
resolution=-1.0,
conversion=1.0,
offset=0.0,
timestamps=None,
starting_time=None,
comments="no comments",
@@ -194,6 +196,7 @@ def mock_TwoPhotonSeries(
control=control,
control_description=control_description,
device=device,
offset=offset,
)

if nwbfile is not None:
15 changes: 15 additions & 0 deletions tests/integration/hdf5/test_io.py
@@ -3,6 +3,7 @@
import numpy as np
from h5py import File
from pathlib import Path
import tempfile

from pynwb import NWBFile, TimeSeries, get_manager, NWBHDF5IO, validate

@@ -14,6 +15,7 @@
from pynwb.spec import NWBGroupSpec, NWBDatasetSpec, NWBNamespace
from pynwb.ecephys import ElectricalSeries, LFP
from pynwb.testing import remove_test_file, TestCase
from pynwb.testing.mock.file import mock_NWBFile


class TestHDF5Writer(TestCase):
@@ -122,6 +124,19 @@ def test_write_no_cache_spec(self):
with File(self.path, 'r') as f:
self.assertNotIn('specifications', f)

def test_file_creation_io_modes(self):
io_modes_that_create_file = ["w", "w-", "x"]

with tempfile.TemporaryDirectory() as temp_dir:
temp_dir = Path(temp_dir)
for io_mode in io_modes_that_create_file:
file_path = temp_dir / f"test_io_mode={io_mode}.nwb"

# Test file creation
nwbfile = mock_NWBFile()
with NWBHDF5IO(str(file_path), io_mode) as io:
io.write(nwbfile)


class TestHDF5WriterWithInjectedFile(TestCase):

@@ -1,5 +1,7 @@
"""Test reading NWB files from the DANDI Archive using ROS3."""
"""Test reading NWB files from the DANDI Archive using fsspec."""
from dandi.dandiapi import DandiAPIClient
import fsspec
import h5py
import random
import sys
import traceback
@@ -10,16 +12,21 @@
# NOTE: do not name the function with "test_" prefix, otherwise pytest
# will try to run it as a test

# TODO read dandisets systematically, not randomly
# see https://github.com/NeurodataWithoutBorders/pynwb/issues/1804

def read_first_nwb_asset():
"""Test reading the first NWB asset from a random selection of 50 dandisets that uses NWB."""
num_dandisets_to_read = 50
"""Test reading the first NWB asset from a random selection of 2 dandisets that uses NWB."""
num_dandisets_to_read = 2
client = DandiAPIClient()
dandisets = list(client.get_dandisets())
random.shuffle(dandisets)
dandisets_to_read = dandisets[:num_dandisets_to_read]
print("Reading NWB files from the following dandisets:")
print([d.get_raw_metadata()["identifier"] for d in dandisets_to_read])

fs = fsspec.filesystem("http")

failed_reads = dict()
for i, dandiset in enumerate(dandisets_to_read):
dandiset_metadata = dandiset.get_raw_metadata()
@@ -47,8 +54,10 @@ def read_first_nwb_asset():
s3_url = first_asset.get_content_url(follow_redirects=1, strip_query=True)

try:
with NWBHDF5IO(path=s3_url, driver="ros3") as io:
io.read()
with fs.open(s3_url, "rb") as f:
with h5py.File(f) as file:
with NWBHDF5IO(file=file) as io:
io.read()
except Exception as e:
print(traceback.format_exc())
failed_reads[dandiset] = e
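
Outside the test harness, the same fsspec-based read pattern looks roughly like this (sketch only; the URL is a placeholder and network access is required):

import fsspec
import h5py
from pynwb import NWBHDF5IO

s3_url = "https://dandiarchive.s3.amazonaws.com/..."  # placeholder asset URL

fs = fsspec.filesystem("http")
with fs.open(s3_url, "rb") as f:
    with h5py.File(f) as file:
        with NWBHDF5IO(file=file) as io:
            nwbfile = io.read()
            print(nwbfile.session_description)
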
10 changes: 6 additions & 4 deletions tests/unit/test_base.py
@@ -406,10 +406,12 @@ def test_get_data_in_units(self):
assert_array_equal(ts.get_data_in_units(), [1., 2., 3.])

def test_non_positive_rate(self):
with self.assertRaisesWith(ValueError, 'Rate must be a positive value.'):
with self.assertRaisesWith(ValueError, 'Rate must not be a negative value.'):
TimeSeries(name='test_ts', data=list(), unit='volts', rate=-1.0)
with self.assertRaisesWith(ValueError, 'Rate must be a positive value.'):
TimeSeries(name='test_ts1', data=list(), unit='volts', rate=0.0)

with self.assertWarnsWith(UserWarning,
'Timeseries has a rate of 0.0 Hz, but the length of the data is greater than 1.'):
TimeSeries(name='test_ts1', data=[1, 2, 3], unit='volts', rate=0.0)

def test_file_with_non_positive_rate_in_construct_mode(self):
"""Test that UserWarning is raised when rate is 0 or negative
@@ -419,7 +421,7 @@ def test_file_with_non_positive_rate_in_construct_mode(self):
parent=None,
object_id="test",
in_construct_mode=True)
with self.assertWarnsWith(warn_type=UserWarning, exc_msg='Rate must be a positive value.'):
with self.assertWarnsWith(warn_type=UserWarning, exc_msg='Rate must not be a negative value.'):
obj.__init__(
name="test_ts",
data=list(),
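
In isolation, the behavior exercised by these tests looks like this (sketch; assumes a pynwb build with the updated checks):

import warnings
from pynwb import TimeSeries

# A negative rate is still a hard error on construction.
try:
    TimeSeries(name="ts_negative_rate", data=[], unit="volts", rate=-1.0)
except ValueError as err:
    print(err)  # Rate must not be a negative value.

# A rate of 0.0 is now allowed but warns when the data has more than one sample.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    TimeSeries(name="ts_zero_rate", data=[1, 2, 3], unit="volts", rate=0.0)
    print(caught[-1].message)
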
19 changes: 19 additions & 0 deletions tests/unit/test_ecephys.py
@@ -18,6 +18,7 @@
from pynwb.device import Device
from pynwb.file import ElectrodeTable
from pynwb.testing import TestCase
from pynwb.testing.mock.ecephys import mock_ElectricalSeries

from hdmf.common import DynamicTableRegion

@@ -115,6 +116,24 @@ def test_dimensions_warning(self):
"but instead the first does. Data is oriented incorrectly and should be transposed."
) in str(w[-1].message)

def test_get_data_in_units(self):

data = np.asarray([[1, 1, 1, 1, 1], [1, 1, 1, 1, 1]])
conversion = 1.0
offset = 3.0
channel_conversion = np.asarray([2.0, 2.0])
electrical_series = mock_ElectricalSeries(
data=data,
conversion=conversion,
offset=offset,
channel_conversion=channel_conversion,
)

data_in_units = electrical_series.get_data_in_units()
expected_data = data * conversion * channel_conversion[:, np.newaxis] + offset

np.testing.assert_almost_equal(data_in_units, expected_data)


class SpikeEventSeriesConstructor(TestCase):
