This is a collection of tooling and resources for spike data - including spike sorting and the analysis of single- and multi-unit recordings.
To contribute a new link to a tool or resource, open an issue mentioning it, or a pull request with a link.
Note that single-unit datasets are listed in the OpenData list.
Single-neuron data are recorded with many different amplifiers and recording systems from different companies, often with different proprietary file formats. This section collects tools and resources related to data management for neural recordings and single-unit data.
NWB (Neurodata Without Borders) is a standardized data schema that can be used to store single-unit data.
Within the NWB ecosystem, there are the following associated tools:
- pynwb, a Python interface for NWB files (see the sketch after this list)
- matNWB, a Matlab interface for NWB files
- NWB conversion tools for converting and combining data from proprietary formats
- NWB widgets for visualizing data in NWB files
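As a minimal sketch of how sorted single-unit data can be stored and read back with pynwb (the file name, metadata values, and spike times below are placeholders for illustration only; see the pynwb documentation for full details):

```python
from datetime import datetime
from dateutil.tz import tzlocal

from pynwb import NWBFile, NWBHDF5IO

# Create an NWB file with the minimal required metadata (placeholder values)
nwbfile = NWBFile(
    session_description="example session",
    identifier="example-id-001",
    session_start_time=datetime.now(tzlocal()),
)

# Add sorted units, one row per putative neuron, with their spike times (in seconds)
nwbfile.add_unit(spike_times=[0.01, 0.35, 1.42])
nwbfile.add_unit(spike_times=[0.22, 0.88, 2.01])

# Write the file to disk, then read it back and inspect the units table
with NWBHDF5IO("example_units.nwb", "w") as io:
    io.write(nwbfile)

with NWBHDF5IO("example_units.nwb", "r") as io:
    nwbfile_in = io.read()
    print(nwbfile_in.units.to_dataframe())
```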
Other tools related to data management for single-unit data include:
- Neo, a Python tool for reading neurophysiology file formats
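As an illustrative sketch of reading a proprietary file with Neo (the IO class and file path here are assumptions for the example; Neo provides a different IO class for each supported acquisition format):

```python
import neo

# Pick the IO class that matches the acquisition system; the file path is a placeholder
reader = neo.io.BlackrockIO(filename="path/to/recording.ns5")

# Neo organizes data into blocks -> segments -> analog signals / spike trains
block = reader.read_block()
segment = block.segments[0]

print(f"{len(segment.analogsignals)} analog signals, {len(segment.spiketrains)} spike trains")
for st in segment.spiketrains:
    print(f"spike train with {len(st)} spikes, t_stop = {st.t_stop}")
```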
Spike sorting is the process of grouping detected spikes into clusters that correspond to putative individual neurons, and many algorithms have been proposed for this problem. The following resources relate specifically to spike sorting.
Spike sorting tutorials:
- A brief overview of spike sorting from the Simons Foundation
- A tutorial from Cambridge Neurotech
- A tutorial from CBBM
- A collection of tutorials from SpikeInterface
SpikeInterface is a collection of Python modules relating to spike sorting and associated processes.
The SpikeInterface ecosystem includes multiple related tools, including:
- spikeextractors provides interfaces for data files
- spiketoolkit provides modules for pre- & post-processing related to spike sorting
- spikesorters provides an interface for running supported spike sorters
- spikecomparison provides tools for comparing and benchmarking sorting outputs
- spikewidgets provides widgets for visualizations
- spikemetrics provides metrics for spike sorting quality control
SpikeInterface provides access to multiple algorithms that can be run through the tool, including this list of spike sorters.
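As a rough sketch of the overall SpikeInterface workflow (this uses the module layout of recent unified releases rather than the separate packages listed above, so exact names may differ by version; the toy recording and sorter choice are for illustration only and the chosen sorter must be installed):

```python
import spikeinterface.extractors as se
import spikeinterface.sorters as ss

# Generate a small toy recording and ground-truth sorting for demonstration
# (in practice, a recording would be loaded with one of the se.read_* functions)
recording, sorting_true = se.toy_example(duration=10, num_channels=4, seed=0)

# List the spike sorters that SpikeInterface supports, and those installed locally
print(ss.available_sorters())
print(ss.installed_sorters())

# Run one of the supported (and installed) spike sorters on the recording
sorting = ss.run_sorter(sorter_name="tridesclous", recording=recording)
print(f"Found {len(sorting.get_unit_ids())} units")
```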
Spike sorters that can be used on microwires, such as in human single-neuron data:
Spike sorters that can be used on high-density recordings, such as in animal recordings:
Spike sorting solutions can vary, and in general require quality control procedures to ensure that sorting solutions are robust and adequately reflect well-isolated units. Different spike sorters can give different solutions, and it may be useful to compare different spike sorters to each other (see the sketch after the following list).
The following are systematic comparisons of different spike sorters:
- SpikeForest is a project that systematically compares multiple spike sorting algorithms.
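As a hedged sketch of comparing two sorting outputs with SpikeInterface's comparison module (the toy recording is for illustration, and the two sorter names are arbitrary examples that must both be installed to run):

```python
import spikeinterface.extractors as se
import spikeinterface.sorters as ss
import spikeinterface.comparison as sc

# Toy recording for illustration; real comparisons would use actual data
recording, _ = se.toy_example(duration=10, num_channels=4, seed=0)

# Run two different (installed) spike sorters on the same recording
sorting_a = ss.run_sorter(sorter_name="tridesclous", recording=recording)
sorting_b = ss.run_sorter(sorter_name="spykingcircus", recording=recording)

# Compare the two outputs: which units from one sorter match units from the other
comparison = sc.compare_two_sorters(sorting_a, sorting_b)
print(comparison.get_matching())
```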
The following are guides to quality control for spike sorting (a brief sketch of computing quality metrics follows the list):
- A quality metrics guide from the Allen Institute
- Notes on spike sorting metrics from Ed Merricks
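As a hedged sketch of computing a few common quality-control metrics per unit with SpikeInterface (the waveform-extraction API shown here is from the 0.9x releases and has since been reworked, so names may differ in newer versions; the toy data and metric names are examples):

```python
import spikeinterface as si
import spikeinterface.extractors as se
import spikeinterface.qualitymetrics as sqm

# Toy recording and sorting for illustration only
recording, sorting = se.toy_example(duration=10, num_channels=4, seed=0)

# Extract waveforms for each unit; the quality metrics are computed from these
waveforms = si.extract_waveforms(recording, sorting, folder="waveforms", overwrite=True)

# Compute a few standard per-unit quality metrics
metrics = sqm.compute_quality_metrics(
    waveforms, metric_names=["firing_rate", "snr", "isi_violation"]
)
print(metrics)
```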
The following collects tools and resources for analyzing (sorted) single-neuron data.
General Tools:
- Spykes is a Python toolbox for spike data analysis & visualizations
- Elephant, a library of analyses for electrophysiology data, which builds on Neo data objects (see the sketch after this list)
- Phy is a Python tool providing a graphical interface for visualization and manual curation of large-scale electrophysiology data
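As a small sketch of the kind of single-unit analysis Elephant supports, operating on a Neo SpikeTrain (the spike times below are made up for illustration):

```python
import quantities as pq
from neo import SpikeTrain
from elephant.statistics import mean_firing_rate, isi

# A toy spike train: spike times in seconds, over a 10-second recording window
spike_train = SpikeTrain([0.5, 1.2, 3.3, 4.1, 7.8, 9.0] * pq.s, t_stop=10.0 * pq.s)

# Mean firing rate over the spike train's duration
print(mean_firing_rate(spike_train))

# Inter-spike intervals between successive spikes
print(isi(spike_train))
```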
The following are tools & utilities for simulations of spiking data:
- MEArec, a Python toolbox for simulating extracellular recordings on multi-electrode arrays
The following are dedicated tutorials for single-cell analyses:
The following are code repositories for individual analyses / projects / papers:
- Replay trajectory classification code
- Code for ictal recruitment in human single-unit activity
- Matlab code for a project on egocentric boundary cells
The following are collections of code available from particular labs:
- A collection of code (Matlab) from the Buzsaki lab
- Code repositories (mostly Matlab) from the Giocomo Lab
- Code repositories from the Frank Lab