Merge branch 'main' into binary_recording_limit_get_traces_allocation
h-mayorquin committed Sep 11, 2023
2 parents 2ea7f1b + a26cb84 commit 37bb8f6
Showing 292 changed files with 9,164 additions and 7,595 deletions.
13 changes: 5 additions & 8 deletions .github/actions/build-test-environment/action.yml
@@ -24,18 +24,15 @@ runs:
pip install -e .[test,extractors,full]
shell: bash
- name: Force installation of latest dev from key-packages when running dev (not release)
id: version
run: |
source ${{ github.workspace }}/test_env/bin/activate
if python ./.github/is_spikeinterface_dev.py; then
spikeinterface_is_dev_version=$(python -c "import importlib.metadata; version = importlib.metadata.version('spikeinterface'); print(version.endswith('dev0'))")
if [ $spikeinterface_is_dev_version = "True" ]; then
echo "Running spikeinterface dev version"
pip uninstall -y neo
pip uninstall -y probeinterface
pip install git+https://github.com/NeuralEnsemble/python-neo
pip install git+https://github.com/SpikeInterface/probeinterface
else
echo "Running tests for release"
pip install --no-cache-dir git+https://github.com/NeuralEnsemble/python-neo
pip install --no-cache-dir git+https://github.com/SpikeInterface/probeinterface
fi
echo "Running tests for release, using pyproject.toml versions of neo and probeinterface"
shell: bash
- name: git-annex install
run: |
6 changes: 0 additions & 6 deletions .github/is_spikeinterface_dev.py

This file was deleted.
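The helper script deleted here is superseded by the inline check in action.yml above. A minimal Python sketch of that check, assuming spikeinterface is installed in the active environment:

    # Sketch of the inline dev-version check (not the project's exact code):
    # a version string ending in "dev0", e.g. "0.99.0.dev0", marks a development install.
    import importlib.metadata

    version = importlib.metadata.version("spikeinterface")
    print(version.endswith("dev0"))  # prints True/False; the bash step compares against "True"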

4 changes: 3 additions & 1 deletion .github/workflows/caches_cron_job.yml
@@ -33,6 +33,7 @@ jobs:
with:
path: ${{ github.workspace }}/test_env
key: ${{ runner.os }}-venv-${{ steps.dependencies.outputs.hash }}-${{ steps.date.outputs.date }}
lookup-only: 'true' # Avoids downloading the data, saving behavior is not affected.
- name: Cache found?
run: echo "Cache-hit == ${{steps.cache-venv.outputs.cache-hit == 'true'}}"
- name: Create the virtual environment to be cached
@@ -64,6 +65,7 @@ jobs:
with:
path: ~/spikeinterface_datasets
key: ${{ runner.os }}-datasets-${{ steps.repo_hash.outputs.dataset_hash }}
lookup-only: 'true' # Avoids downloading the data, saving behavior is not affected.
- name: Cache found?
run: echo "Cache-hit == ${{steps.cache-datasets.outputs.cache-hit == 'true'}}"
- name: Installing datalad and git-annex
@@ -88,7 +90,7 @@
run: |
cd $HOME
pwd
du -hs spikeinterface_datasets
du -hs spikeinterface_datasets # Should show the size of ephy_testing_data
cd spikeinterface_datasets
pwd
ls -lh # Should show ephy_testing_data
5 changes: 1 addition & 4 deletions .github/workflows/full-test-with-codecov.yml
@@ -32,8 +32,6 @@ jobs:
with:
path: ${{ github.workspace }}/test_env
key: ${{ runner.os }}-venv-${{ hashFiles('**/pyproject.toml') }}-${{ steps.date.outputs.date }}
restore-keys: |
${{ runner.os }}-venv-
- name: Get ephy_testing_data current head hash
# the key depends on the last commit of the repo https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git
id: vars
@@ -48,8 +46,7 @@
with:
path: ~/spikeinterface_datasets
key: ${{ runner.os }}-datasets-${{ steps.vars.outputs.HASH_EPHY_DATASET }}
restore-keys: |
${{ runner.os }}-datasets
restore-keys: ${{ runner.os }}-datasets
- name: Install packages
uses: ./.github/actions/build-test-environment
- name: Shows installed packages by pip, git-annex and cached testing files
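The "Get ephy_testing_data current head hash" step above derives the dataset cache key from the remote HEAD of the GIN repository. A rough Python sketch of such a lookup (illustration only; the workflow's actual shell command is not shown in this diff):

    # Illustration: fetch the remote HEAD hash that the dataset cache key depends on.
    import subprocess

    url = "https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git"
    result = subprocess.run(["git", "ls-remote", url, "HEAD"],
                            capture_output=True, text=True, check=True)
    head_hash = result.stdout.split()[0]  # output format: "<hash>\tHEAD"
    print(head_hash)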
9 changes: 5 additions & 4 deletions .github/workflows/full-test.yml
@@ -37,8 +37,6 @@ jobs:
with:
path: ${{ github.workspace }}/test_env
key: ${{ runner.os }}-venv-${{ hashFiles('**/pyproject.toml') }}-${{ steps.date.outputs.date }}
restore-keys: |
${{ runner.os }}-venv-
- name: Get ephy_testing_data current head hash
# the key depends on the last commit of the repo https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git
id: vars
@@ -53,8 +51,7 @@
with:
path: ~/spikeinterface_datasets
key: ${{ runner.os }}-datasets-${{ steps.vars.outputs.HASH_EPHY_DATASET }}
restore-keys: |
${{ runner.os }}-datasets
restore-keys: ${{ runner.os }}-datasets
- name: Install packages
uses: ./.github/actions/build-test-environment
- name: Shows installed packages by pip, git-annex and cached testing files
@@ -66,6 +63,10 @@
id: modules-changed
run: |
for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
if [[ $file == *"pyproject.toml" ]]; then
echo "pyproject.toml changed"
echo "CORE_CHANGED=true" >> $GITHUB_OUTPUT
fi
if [[ $file == *"/core/"* || $file == *"/extractors/neoextractors/neobaseextractor.py" ]]; then
echo "Core changed"
echo "CORE_CHANGED=true" >> $GITHUB_OUTPUT
4 changes: 2 additions & 2 deletions .github/workflows/publish-to-pypi.yml
@@ -22,10 +22,10 @@ jobs:
pip install pytest
pip install zarr
pip install setuptools wheel twine build
pip install -e .
pip install -e .[test_core]
- name: Test core with pytest
run: |
pytest -v spikeinterface/core
pytest -v src/spikeinterface/core
- name: Publish on PyPI
env:
TWINE_USERNAME: __token__
19 changes: 17 additions & 2 deletions .github/workflows/streaming-extractor-test.yml
@@ -1,9 +1,11 @@
name: Test streaming extractors

on:
pull_request:
types: [synchronize, opened, reopened]
branches:
- main

concurrency: # Cancel previous workflows on the same pull request
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
@@ -30,14 +32,27 @@ jobs:
- run: git fetch --prune --unshallow --tags
- name: Install openblas
run: sudo apt install libopenblas-dev # Necessary for ROS3 support
- name: Install package and streaming extractor dependencies
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@v35
- name: Module changes
id: modules-changed
run: |
pip install -e .[test_core,streaming_extractors]
for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
if [[ $file == *"/nwbextractors.py" || $file == *"/iblstreamingrecording.py"* ]]; then
echo "Streaming files changed changed"
echo "STREAMING_CHANGED=true" >> $GITHUB_OUTPUT
fi
done
- name: Install package and streaming extractor dependencies
if: ${{ steps.modules-changed.outputs.STREAMING_CHANGED == 'true' }}
run: pip install -e .[test_core,streaming_extractors]
# Temporary disabled because of complicated error with path
# - name: Install h5py with ROS3 support and test it works
# run: |
# pip uninstall -y h5py
# conda install -c conda-forge "h5py>=3.2"
# python -c "import h5py; assert 'ros3' in h5py.registered_drivers(), f'ros3 suppport not available, failed to install'"
- name: run tests
if: steps.modules-changed.outputs.STREAMING_CHANGED == 'true'
run: pytest -m "streaming_extractors and not ros3_test" -vv -ra
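The final pytest call selects tests by marker expression. A minimal, hypothetical sketch of a test that this expression would collect (assuming the streaming_extractors marker is registered in the project's pytest configuration):

    # Hypothetical test module: collected by -m "streaming_extractors and not ros3_test";
    # it would be excluded if it also carried the ros3_test marker.
    import pytest

    @pytest.mark.streaming_extractors
    def test_streaming_extractor_smoke():
        assert True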
3 changes: 1 addition & 2 deletions .github/workflows/test_containers_docker.yml
@@ -34,7 +34,6 @@ jobs:
run: |
echo $SPIKEINTERFACE_DEV_PATH
python -c "import os; assert os.getenv('SPIKEINTERFACE_DEV_PATH') is not None"
ls -l
- name: Run test singularity containers
- name: Run test docker containers
run: |
pytest -vv --capture=tee-sys -rA src/spikeinterface/sorters/external/tests/test_docker_containers.py
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
- id: end-of-file-fixer
- id: trailing-whitespace
- repo: https://github.com/psf/black
rev: 23.3.0
rev: 23.7.0
hooks:
- id: black
files: ^src/
5 changes: 3 additions & 2 deletions doc/api.rst
@@ -149,6 +149,7 @@ spikeinterface.preprocessing
.. autofunction:: clip
.. autofunction:: common_reference
.. autofunction:: correct_lsb
.. autofunction:: correct_motion
.. autofunction:: depth_order
.. autofunction:: detect_bad_channels
.. autofunction:: directional_derivative
@@ -268,13 +269,14 @@ spikeinterface.widgets
.. autofunction:: plot_amplitudes
.. autofunction:: plot_autocorrelograms
.. autofunction:: plot_crosscorrelograms
.. autofunction:: plot_motion
.. autofunction:: plot_quality_metrics
.. autofunction:: plot_sorting_summary
.. autofunction:: plot_spike_locations
.. autofunction:: plot_spikes_on_traces
.. autofunction:: plot_template_metrics
.. autofunction:: plot_template_similarity
.. autofunction:: plot_timeseries
.. autofunction:: plot_traces
.. autofunction:: plot_unit_depths
.. autofunction:: plot_unit_locations
.. autofunction:: plot_unit_summary
@@ -294,7 +296,6 @@ These widgets are only available with the "matplotlib" backend
.. autofunction:: plot_rasters
.. autofunction:: plot_probe_map
.. autofunction:: plot_isi_distribution
.. autofunction:: plot_drift_over_time
.. autofunction:: plot_peak_activity_map
.. autofunction:: plot_principal_component
.. autofunction:: plot_unit_probe_map
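A brief, hedged usage sketch for two of the entries touched above: correct_motion (newly listed under spikeinterface.preprocessing) and plot_traces (which replaces plot_timeseries in the widgets module). The recording rec is assumed to exist, and default parameters may differ:

    # Sketch only, not verbatim from the documentation.
    import spikeinterface.full as si

    # rec: an existing RecordingExtractor (assumed defined elsewhere)
    rec_corrected = si.correct_motion(rec)                # drift/motion correction (preprocessing)
    si.plot_traces(rec_corrected, backend="matplotlib")   # renamed from plot_timeseries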
14 changes: 7 additions & 7 deletions doc/how_to/analyse_neuropixels.rst
@@ -264,7 +264,7 @@ the ipywydgets interactive ploter
.. code:: python
%matplotlib widget
si.plot_timeseries({'filter':rec1, 'cmr': rec4}, backend='ipywidgets')
si.plot_traces({'filter':rec1, 'cmr': rec4}, backend='ipywidgets')
Note that using these ipywidgets makes it possible to explore different
preprocessing chains without saving the entire file to disk. Everything
@@ -276,9 +276,9 @@ is lazy, so you can change the previsous cell (parameters, step order,
# here we use static plot using matplotlib backend
fig, axs = plt.subplots(ncols=3, figsize=(20, 10))
si.plot_timeseries(rec1, backend='matplotlib', clim=(-50, 50), ax=axs[0])
si.plot_timeseries(rec4, backend='matplotlib', clim=(-50, 50), ax=axs[1])
si.plot_timeseries(rec, backend='matplotlib', clim=(-50, 50), ax=axs[2])
si.plot_traces(rec1, backend='matplotlib', clim=(-50, 50), ax=axs[0])
si.plot_traces(rec4, backend='matplotlib', clim=(-50, 50), ax=axs[1])
si.plot_traces(rec, backend='matplotlib', clim=(-50, 50), ax=axs[2])
for i, label in enumerate(('filter', 'cmr', 'final')):
axs[i].set_title(label)
@@ -292,7 +292,7 @@ is lazy, so you can change the previsous cell (parameters, step order,
# plot some channels
fig, ax = plt.subplots(figsize=(20, 10))
some_chans = rec.channel_ids[[100, 150, 200, ]]
si.plot_timeseries({'filter':rec1, 'cmr': rec4}, backend='matplotlib', mode='line', ax=ax, channel_ids=some_chans)
si.plot_traces({'filter':rec1, 'cmr': rec4}, backend='matplotlib', mode='line', ax=ax, channel_ids=some_chans)
@@ -426,7 +426,7 @@ Let’s use here the ``locally_exclusive`` method for detection and the
job_kwargs = dict(n_jobs=40, chunk_duration='1s', progress_bar=True)
peaks = detect_peaks(rec, method='locally_exclusive', noise_levels=noise_levels_int16,
detect_threshold=5, local_radius_um=50., **job_kwargs)
detect_threshold=5, radius_um=50., **job_kwargs)
peaks
@@ -451,7 +451,7 @@ Let’s use here the ``locally_exclusive`` method for detection and the
from spikeinterface.sortingcomponents.peak_localization import localize_peaks
peak_locations = localize_peaks(rec, peaks, method='center_of_mass', local_radius_um=50., **job_kwargs)
peak_locations = localize_peaks(rec, peaks, method='center_of_mass', radius_um=50., **job_kwargs)
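Putting the two updated calls together, a sketch that follows the tutorial's own variables (rec, noise_levels_int16 and job_kwargs are defined earlier in that document); note that local_radius_um is now radius_um in both functions:

    # Sketch combining the updated peak detection and localization calls.
    from spikeinterface.sortingcomponents.peak_detection import detect_peaks
    from spikeinterface.sortingcomponents.peak_localization import localize_peaks

    job_kwargs = dict(n_jobs=40, chunk_duration='1s', progress_bar=True)
    peaks = detect_peaks(rec, method='locally_exclusive', noise_levels=noise_levels_int16,
                         detect_threshold=5, radius_um=50., **job_kwargs)
    peak_locations = localize_peaks(rec, peaks, method='center_of_mass', radius_um=50., **job_kwargs)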