Merge branch 'main' of github.com:SpikeInterface/spikeinterface into rel-path-phy
Showing 45 changed files with 717 additions and 191 deletions.
@@ -7,3 +7,4 @@ How to guides

   get_started
   analyse_neuropixels
   handle_drift
   load_matlab_data
@@ -0,0 +1,100 @@

Exporting MATLAB Data to Binary & Loading in SpikeInterface
===========================================================

In this tutorial, we will walk through the process of exporting data from MATLAB in a binary format and subsequently loading it using SpikeInterface in Python.
Exporting Data from MATLAB
--------------------------

Begin by ensuring your data structure is correct: organize your data matrix so that the first dimension corresponds to samples/time and the second to channels.
As an illustration, the MATLAB code below creates a random dataset and writes it to a binary file.

.. code-block:: matlab

    % Define the size of your data
    numSamples = 1000;
    numChannels = 384;

    % Generate random data as an example
    data = rand(numSamples, numChannels);

    % Write the data to a binary file
    fileID = fopen('your_data_as_a_binary.bin', 'wb');
    fwrite(fileID, data, 'double');
    fclose(fileID);
.. note::

    In your own script, replace the random data generation with your actual dataset.

Loading Data in SpikeInterface
------------------------------

After executing the above MATLAB code, a binary file named ``your_data_as_a_binary.bin`` will be created in your MATLAB directory. To load this file in Python, you'll need its full path.

Use the following Python script to load the binary data into SpikeInterface:
.. code-block:: python

    import spikeinterface as si
    from pathlib import Path

    # Define the file path
    # For Linux or macOS:
    file_path = Path("/The/Path/To/Your/Data/your_data_as_a_binary.bin")
    # For Windows:
    # file_path = Path(r"c:\path\to\your\data\your_data_as_a_binary.bin")

    # Confirm that the file exists
    assert file_path.is_file(), f"Error: {file_path} is not a valid file. Please check the path."

    # Define the recording parameters
    sampling_frequency = 30_000.0  # Adjust according to your MATLAB dataset
    num_channels = 384             # Adjust according to your MATLAB dataset
    dtype = "float64"              # MATLAB's double corresponds to NumPy's float64

    # Load the data using SpikeInterface
    recording = si.read_binary(file_path, sampling_frequency=sampling_frequency,
                               num_channels=num_channels, dtype=dtype)

    # Confirm that the data was loaded correctly by checking that the
    # number of frames and channels match the MATLAB data
    print(recording.get_num_frames(), recording.get_num_channels())

Follow the steps above to import your MATLAB data into SpikeInterface. Once loaded, you can harness the full power of SpikeInterface for data processing, including filtering, spike sorting, and more.
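As a quick, illustrative sketch of what such processing can look like (this goes beyond the export workflow above, and the filter cutoffs shown are placeholder values), the loaded recording can be passed directly to the preprocessing module:

.. code-block:: python

    import spikeinterface.preprocessing as spre

    # Bandpass filter the loaded recording; adjust the cutoffs to your own use case
    recording_filtered = spre.bandpass_filter(recording, freq_min=300.0, freq_max=6000.0)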
Common Pitfalls & Tips
----------------------

1. **Data Shape**: Make sure your MATLAB data matrix's first dimension is samples/time and the second is channels. If your time is in the second dimension, use ``time_axis=1`` in ``si.read_binary()`` (see the sketch after this list).
2. **File Path**: Always double-check the Python file path.
3. **Data Type Consistency**: Ensure data types between MATLAB and Python are consistent. MATLAB's ``double`` is equivalent to NumPy's ``float64``.
4. **Sampling Frequency**: Set the appropriate sampling frequency in Hz for SpikeInterface.
5. **Transition to Python**: Moving from MATLAB to Python can be challenging. If you are new to Python, consider reviewing the `NumPy for MATLAB users <https://numpy.org/doc/stable/user/numpy-for-matlab-users.html>`_ guide.
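A minimal sketch of the ``time_axis`` option mentioned in tip 1, assuming the same ``file_path`` and recording parameters defined above:

.. code-block:: python

    # If the binary file stores channels along the first dimension
    # (so time runs along the second axis), ask read_binary to transpose it
    recording = si.read_binary(file_path, sampling_frequency=sampling_frequency,
                               num_channels=num_channels, dtype=dtype,
                               time_axis=1)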
Using gains and offsets for integer data
----------------------------------------

Raw data formats often store data as integer values for memory efficiency. To give these integers meaningful physical units, you can apply a gain and an offset.
In SpikeInterface, you can use the ``gain_to_uV`` and ``offset_to_uV`` parameters, since traces are handled in microvolts (uV). Both parameters can be passed to the ``read_binary`` function.
If your data in MATLAB is stored as ``int16``, and you know the gain and offset, you can use the following code to load the data:

.. code-block:: python

    sampling_frequency = 30_000.0  # Adjust according to your MATLAB dataset
    num_channels = 384             # Adjust according to your MATLAB dataset
    dtype_int = 'int16'            # Adjust according to your MATLAB dataset
    gain_to_uV = 0.195             # Adjust according to your MATLAB dataset
    offset_to_uV = 0               # Adjust according to your MATLAB dataset

    recording = si.read_binary(file_path, sampling_frequency=sampling_frequency,
                               num_channels=num_channels, dtype=dtype_int,
                               gain_to_uV=gain_to_uV, offset_to_uV=offset_to_uV)

    recording.get_traces(return_scaled=True)  # Return traces in microvolts (uV)

This will equip your recording object with the ability to convert the data to float values in uV using the :code:`get_traces()` method with the :code:`return_scaled` parameter set to :code:`True`.
.. note::

    The gain and offset parameters are usually format-dependent, and you will need to find the correct values for your data format. You can load your data without gain and offset, but the traces will then be returned as integer values rather than in uV.
@@ -0,0 +1,55 @@

Amplitude CV (:code:`amplitude_cv_median`, :code:`amplitude_cv_range`)
======================================================================

Calculation
-----------

The amplitude CV (coefficient of variation) is a measure of amplitude variability.
It is computed as the ratio between the standard deviation and the mean of the spike amplitudes.
To obtain a more robust estimate, this measure is first computed separately for several temporal bins.
From these values, the median and the range (the percentile distance, by default between the
5th and 95th percentiles) are computed.
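For intuition, here is a simplified NumPy sketch of the calculation (not the actual SpikeInterface implementation), assuming ``amplitudes`` and ``spike_times`` are arrays holding one unit's spike amplitudes and spike times in seconds, and using an illustrative bin duration:

.. code-block:: python

    import numpy as np

    bin_duration_s = 5.0  # illustrative temporal bin size
    bin_edges = np.arange(0, spike_times.max() + bin_duration_s, bin_duration_s)
    bin_ids = np.digitize(spike_times, bin_edges)

    # Coefficient of variation (std / mean) of the amplitudes in each temporal bin
    cvs = np.array([amplitudes[bin_ids == b].std() / amplitudes[bin_ids == b].mean()
                    for b in np.unique(bin_ids)])

    amplitude_cv_median = np.median(cvs)
    amplitude_cv_range = np.percentile(cvs, 95) - np.percentile(cvs, 5)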
The computation requires either spike amplitudes (see :py:func:`~spikeinterface.postprocessing.compute_spike_amplitudes()`)
or amplitude scalings (see :py:func:`~spikeinterface.postprocessing.compute_amplitude_scalings()`) to be pre-computed.


Expectation and use
-------------------

The amplitude CV median is expected to be relatively low for well-isolated units, indicating a "stereotypical" spike shape.

The amplitude CV range can be high in the presence of noise contamination, due to amplitude outliers like in
the example below.

.. image:: amplitudes.png
    :width: 600
Example code
------------

.. code-block:: python

    import spikeinterface.qualitymetrics as sqm

    # Create a recording, sorting, and wvf_extractor object for your data.
    # Running `compute_spike_amplitudes(wvf_extractor)` or
    # `compute_amplitude_scalings(wvf_extractor)` beforehand is required
    # (if missing, the values will be NaN).

    amplitude_cv_median, amplitude_cv_range = sqm.compute_amplitude_cv_metrics(wvf_extractor)
    # amplitude_cv_median and amplitude_cv_range are dicts containing the unit ids as keys,
    # and their amplitude_cv metrics as values.
References
----------

.. autofunction:: spikeinterface.qualitymetrics.misc_metrics.compute_amplitude_cv_metrics


Literature
----------

Designed by Simon Musall and adapted to SpikeInterface by Alessio Buccino.
@@ -0,0 +1,40 @@

Firing range (:code:`firing_range`)
===================================

Calculation
-----------

The firing range indicates the dispersion of a unit's firing rate across the recording. It is computed by
taking the difference between the 95th and the 5th percentiles of the firing rates computed over short time bins (e.g. 10 s).
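For intuition, here is a simplified NumPy sketch of the idea (not the actual SpikeInterface implementation), assuming ``spike_times`` holds one unit's spike times in seconds and ``total_duration`` is the recording duration:

.. code-block:: python

    import numpy as np

    bin_size_s = 10.0  # short time bins, as in the description above
    bin_edges = np.arange(0, total_duration + bin_size_s, bin_size_s)

    # Firing rate in each bin: spike count divided by the bin size
    counts, _ = np.histogram(spike_times, bins=bin_edges)
    firing_rates = counts / bin_size_s

    # Firing range: difference between the 95th and 5th percentile firing rates
    firing_range = np.percentile(firing_rates, 95) - np.percentile(firing_rates, 5)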
Expectation and use
-------------------

Very high firing ranges, outside of the physiological range, might indicate noise contamination.
Example code
------------

.. code-block:: python

    import spikeinterface.qualitymetrics as sqm

    # Create a recording, sorting, and wvf_extractor object for your data.

    firing_range = sqm.compute_firing_ranges(wvf_extractor)
    # firing_range is a dict containing the unit IDs as keys,
    # and their firing_range as values (in Hz).
References
----------

.. autofunction:: spikeinterface.qualitymetrics.misc_metrics.compute_firing_ranges


Literature
----------

Designed by Simon Musall and adapted to SpikeInterface by Alessio Buccino.