# Neuro-CNN: Convolutional Neural Networks for neural localization and classification

Neuro-CNN is a project that uses machine learning, in particular Convolutional Neural Networks (CNNs), to localize neurons in 3D and to classify their neural type from Extracellular Action Potentials (EAPs) recorded on Multi-Electrode Arrays (MEAs).

To clone this repo, open your terminal and run:

```
git clone https://github.com/CINPLA/NeuroCNN.git
```

## Pre-requisites

Neuro-CNN runs on Python 2 and Python 3. To install all required packages, we recommend creating an [Anaconda](https://www.anaconda.com/download/) environment from the provided environment files. Open your terminal and run:

For Anaconda2:

```
conda env create -f environment.yml
```

For Anaconda3:

```
conda env create -f environment3.yml
```

Then activate the environment:

On Linux/macOS:

```
source activate neurocnn
```

On Windows:

```
activate neurocnn
```

The neural simulations rely on [NEURON](https://www.neuron.yale.edu/neuron/) and the latest version of LFPy. Once NEURON is installed, you can install LFPy (all other requirements are installed in the environment):

```
git clone https://github.com/LFPy/LFPy
cd LFPy
python setup.py install
```

## Cell simulations

Cell models can be downloaded from the Neocortical Micro Circuit Portal (https://bbp.epfl.ch/nmc-portal/welcome); 13 layer 5 models for testing are already included. Newly downloaded models should be unzipped into the `cell_models/bbp/` folder.

First, you must compile the .mod files:

```
python hbp_cells.py compile
```

This only has to be done once, unless you add more models or modify the .mod files. (If compilation fails, try installing: `sudo apt-get install lib32ncurses5-dev`.)

To run all the L5 cell models you have downloaded:

```
python do_all_cell_simulations.py
```

From the command line you can pass arguments to customize your simulated data:

- `-model` -- cell model, corresponding to the subfolder in `cell_models` (default: `bbp`)
- `-rot` -- 3D rotation (`Norot` - `physrot` - `3drot`) to apply to the models before computing EAPs (default: `physrot`)
- `-intraonly` -- only simulate intracellular dynamics (leave EAP simulation for later)
- `-probe` -- choose the probe type from the list in the `electrodes` folder*
- `-n` -- number of observations per cell type

*or create your own! All you have to do is create a .json file with some specs and add your probe in the `get_elcoords()` function of the MEAutility file.

Your EAPs will be stored in the `<data_dir>/spikes/bbp/<rotation_type>/` folder (where `<data_dir>` is defined in `defaultconfig.py` and can be customized with a local `config.py` file), and the folder containing the spikes is named `e_<n-obs-per-spike>_1px_<y-pitch>_<z-pitch>_<MEAname>_<date>`.
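If you script over many generated folders, the naming convention above can be parsed back into its parameters. A minimal, hypothetical helper (it assumes `<MEAname>` and `<date>` contain no underscores, which may not hold for every probe name):

```python
def parse_spike_folder(name):
    """Split a spikes folder name of the form
    e_<n-obs-per-spike>_1px_<y-pitch>_<z-pitch>_<MEAname>_<date>
    into its fields. Assumes <MEAname> and <date> contain no underscores.
    """
    prefix, n_obs, px, y_pitch, z_pitch, mea_name, date = name.split("_")
    # Sanity-check the fixed parts of the convention
    assert prefix == "e" and px == "1px", "unexpected folder name format"
    return {
        "n_obs": int(n_obs),      # observations per spike
        "y_pitch": float(y_pitch),
        "z_pitch": float(z_pitch),
        "mea": mea_name,          # probe name
        "date": date,
    }
```

For example, `parse_spike_folder("e_1000_1px_15_15_SqMEA_2018-05-28")` returns a dict with `n_obs=1000` and `mea="SqMEA"` (the probe name and date here are illustrative).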

## Localization and classification with CNNs (Tensorflow)

After EAPs are simulated you can train CNNs for localization and classification.

To run localization, change to the `localization` directory and run:

```
python conv_localization.py -f <path-to-spikes-folder>
```

You can give command line arguments to customize the network:

- `-f` -- path to spikes folder
- `-feat` -- feature type: `Na` - `Rep` - `NaRep` (default) - `3d` (waveform)
- `-val` -- validation: `holdout` (default) - `hold-model-out`
- `-n` -- number of training steps
- `-cn` -- cell model names to use
- `-tcn` -- train cellnames file*
- `-vcn` -- validation cellnames file*
- `-s` -- CNN size: `xs` - `s` - `m` - `l` (default) - `xl`
- `-modelout` -- model to hold out (when validation is `hold-model-out`)
- `-seed` -- random seed to shuffle the data
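For instance, an illustrative invocation combining some of the flags above (the spikes path is a placeholder, and the flag values are just examples):

```shell
# Train a large CNN on NaRep features with the default holdout validation
python conv_localization.py -f <path-to-spikes-folder> \
    -feat NaRep -val holdout -s l -seed 42
```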

To run classification, change to the `classification` directory and run:

```
python conv_classification.py -f <path-to-spikes-folder>
```

The command line arguments are the same as localization except:

- `-feat` -- feature type: `AW` - `FW` (default) - `AFW` - `3d` (waveform)
- `-cl` -- classification type: `binary` (excitatory-inhibitory, default) - `m-type`

*When the `-tcn` and `-vcn` arguments are used, the CNN is trained on the models in the `-tcn` file and validated on the models in the `-vcn` file (we recommend validating on cell models not used for training to test for overfitting). To reduce overfitting, train on a large dataset covering a variety of cell models!
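As an illustrative classification run on m-type labels (the spikes path is again a placeholder):

```shell
# Classify morphological type instead of the default binary labels
python conv_classification.py -f <path-to-spikes-folder> -feat FW -cl m-type
```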

The models are stored in `<data_dir>/localization/models/` (or `<data_dir>/classification/models/`) and named `model_<rotation>_<feat-type>_<val-type>_<size>_<spike-folder>_<date>`.

## Using the models

To reload the models and localize (or classify) another dataset, you can use the `predict_location.py` (or `predict_classification.py`) script:

```
python predict_location.py -mod <CNN-model-folder> -val <spikes-folder>
```

The spikes folder is a folder generated with `do_all_cell_simulations.py`, and the CNN model folder is the model generated by `conv_localization.py`/`conv_classification.py`.

## References

For details please take a look at our paper: "Combining forward modeling and deep learning for neural localization and classification", ...

## Contact us

If you have problems running the code, don't hesitate to contact us, or open an issue on the GitHub page.

Alessio Buccino - [email protected]

Michael Kordovan - [email protected]