
Commit

Merge branch 'main' into empty-subject-with-multiple-sessions
garrettmflynn committed Apr 3, 2024
2 parents bcf39e4 + f67d9c6 commit 3640026
Showing 61 changed files with 257 additions and 42 deletions.
Binary file modified docs/assets/tutorials/home-page.png
Binary file added docs/assets/tutorials/multiple/fail-name.png
Binary file added docs/assets/tutorials/multiple/formats-page.png
Binary file added docs/assets/tutorials/multiple/info-page.png
Binary file added docs/assets/tutorials/multiple/inspect-page.png
Binary file added docs/assets/tutorials/multiple/intro-page.png
Binary file added docs/assets/tutorials/multiple/metadata-page.png
Binary file added docs/assets/tutorials/multiple/preview-page.png
Binary file added docs/assets/tutorials/multiple/subject-error.png
Binary file added docs/assets/tutorials/multiple/subject-page.png
Binary file added docs/assets/tutorials/multiple/valid-name.png
Binary file added docs/assets/tutorials/multiple/workflow-page.png
Binary file modified docs/assets/tutorials/single/all-interfaces-added.png
Binary file modified docs/assets/tutorials/single/conversion-results-page.png
Binary file modified docs/assets/tutorials/single/fail-name.png
Binary file modified docs/assets/tutorials/single/format-options.png
Binary file modified docs/assets/tutorials/single/formats-page.png
Binary file modified docs/assets/tutorials/single/home-page-complete.png
Binary file modified docs/assets/tutorials/single/info-page.png
Binary file modified docs/assets/tutorials/single/inspect-page.png
Binary file modified docs/assets/tutorials/single/interface-added.png
Binary file modified docs/assets/tutorials/single/intro-page.png
Binary file modified docs/assets/tutorials/single/metadata-ecephys.png
Binary file modified docs/assets/tutorials/single/metadata-nwbfile.png
Binary file modified docs/assets/tutorials/single/metadata-page.png
Binary file modified docs/assets/tutorials/single/metadata-subject-complete.png
Binary file modified docs/assets/tutorials/single/preview-page.png
Binary file modified docs/assets/tutorials/single/search-behavior.png
Binary file modified docs/assets/tutorials/single/sourcedata-page-specified.png
Binary file modified docs/assets/tutorials/single/sourcedata-page.png
Binary file modified docs/assets/tutorials/single/valid-name.png
Binary file modified docs/assets/tutorials/single/workflow-page.png
1 change: 1 addition & 0 deletions docs/conf_extlinks.py
@@ -5,6 +5,7 @@
"pynwb-docs": ("https://pynwb.readthedocs.io/en/stable/%s", "%s"),
"matnwb-src": ("https://github.com/NeurodataWithoutBorders/matnwb/%s", "%s"),
"nwb-overview": ("https://nwb-overview.readthedocs.io/en/latest/%s", "%s"),
"path-expansion-guide": ("https://neuroconv.readthedocs.io/en/main/user_guide/expand_path.html%s", "%s"),
"conda-install": (
"https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html#regular-installation%s",
"%s",
124 changes: 123 additions & 1 deletion docs/tutorials/multiple_sessions.rst
@@ -1,3 +1,125 @@
Managing Multiple Sessions
==========================
Coming soon...

Let’s say that you’ve already run some of your experiments and now want to convert them all at the same time. This is where the multi-session workflow comes in handy.

Workflow Configuration
----------------------

On the Workflow page, confirm that this pipeline will be run on multiple sessions. Then select that you’d like to locate the source data programmatically and skip dataset publication.

.. figure:: ../assets/tutorials/multiple/workflow-page.png
:align: center
:alt: Workflow page with multiple sessions and locate data selected

Complete the first section of the GUIDE as normal until you reach a new **Locate Data** page after the Data Formats page.

Locate Data
-----------
This page helps you automatically identify source data for multiple subjects / sessions as long as your files are organized consistently.

.. figure:: ../assets/tutorials/multiple/pathexpansion-page.png
:align: center
:alt: Blank path expansion page

File locations are specified as **format strings** that define the source data paths for each selected data format.

.. note::
Format strings are one component of NeuroConv's **path expansion language**, which has some nifty features for manually specifying complex paths. Complete documentation of the path expansion feature of NeuroConv can be found :path-expansion-guide:`here <>`.

While you don’t have to specify format strings for all of the pipeline’s data formats, we’re going to find all of our data here for this tutorial. You'll always be able to confirm or manually select the final paths on the Source Data page later in the workflow.

Format strings are specified using two components: the **base directory**, which is the directory to search in, and the **format string path**, which describes where the source data sits within that directory.

Given the structure of the tutorial dataset, we’ll select ``~/NWB_GUIDE/test-data/dataset`` as the **base directory**, where **~** is the home directory of your system.
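
For illustration, here is a minimal sketch (plain Python, not the GUIDE's own implementation) of how a format string resolves against the base directory for one subject and session; the directory layout follows the tutorial dataset used throughout this workflow:

.. code-block:: python

    from pathlib import Path

    base_directory = Path("~/NWB_GUIDE/test-data/dataset").expanduser()

    # Format string path for the SpikeGLX AP band, matching the tutorial dataset's layout
    format_string_path = (
        "{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_g0/"
        "{subject_id}_{session_id}_g0_imec0/{subject_id}_{session_id}_g0_t0.imec0.ap.bin"
    )

    # For subject "mouse1" and session "Session1", this expands to:
    resolved = base_directory / format_string_path.format(subject_id="mouse1", session_id="Session1")
    # .../dataset/mouse1/mouse1_Session1/mouse1_Session1_g0/mouse1_Session1_g0_imec0/
    #     mouse1_Session1_g0_t0.imec0.ap.bin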

We can take advantage of the **Autocomplete** feature of this page. Instead of manually filling out the format string, click the **Autocomplete** button to open a pop-up form that will derive the format string from a single example path.

.. figure:: ../assets/tutorials/multiple/pathexpansion-autocomplete-open.png
:align: center
:alt: Autocomplete modal on path expansion page

Provide an example source data path (for example, the ``mouse1_Session1_g0_t0.imec0.lf.bin`` file for SpikeGLX), followed by the Subject (``mouse1``) and Session ID (``Session1``) for this particular path.

.. figure:: ../assets/tutorials/multiple/pathexpansion-autocomplete-filled.png
:align: center
:alt: Autocomplete modal completed

When you submit this form, you’ll notice that the Format String Path input has been auto-filled with a pattern for all the sessions.

.. figure:: ../assets/tutorials/multiple/pathexpansion-autocomplete-submitted.png
:align: center
:alt: Path expansion page with autocompleted format string
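
Conceptually, autocomplete only needs to substitute the placeholders back into your example path. A rough sketch of the idea (hypothetical helper, not the GUIDE's actual code):

.. code-block:: python

    def derive_format_string(example_path: str, subject_id: str, session_id: str) -> str:
        """Turn one concrete path into a format string by replacing the IDs with placeholders."""
        return example_path.replace(subject_id, "{subject_id}").replace(session_id, "{session_id}")

    derive_format_string(
        "mouse1/mouse1_Session1/mouse1_Session1_g0/mouse1_Session1_g0_imec0/mouse1_Session1_g0_t0.imec0.ap.bin",
        subject_id="mouse1",
        session_id="Session1",
    )
    # -> '{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_g0/...'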

Repeat this process for Phy, where ``mouse1_Session2_phy`` will be the example source data path.

.. figure:: ../assets/tutorials/multiple/pathexpansion-completed.png
:align: center
:alt: Completed path expansion information

Advance to the next page when you have entered the data locations for both formats.

Subject Metadata
----------------
On this page you’ll edit subject-level metadata that applies across all related sessions. Unlike on the previous few pages, you’ll notice that
Sex and Species both have gray asterisks next to their names; this means they are **loose requirements**, which aren’t required at this step
but could block progress later if left unspecified.

.. figure:: ../assets/tutorials/multiple/subject-page.png
:align: center
:alt: Blank subject table

In this case, we have two subjects with two sessions each. Let’s say that each subject’s sessions happened close enough in time that they can share a single **age** entry: ``P29W`` for ``mouse1`` and ``P30W`` for ``mouse2``.

We should also indicate the ``sex`` of each subject since this is a requirement for `uploading to the DANDI Archive <https://www.dandiarchive.org/handbook/135_validation/#missing-dandi-metadata>`_.

.. figure:: ../assets/tutorials/multiple/subject-complete.png
:align: center
:alt: Complete subject table
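
For reference, these fields correspond directly to the NWB ``Subject`` type. A minimal sketch with PyNWB (ages taken from this tutorial; the sex values shown are illustrative):

.. code-block:: python

    from pynwb.file import Subject

    # Ages use ISO 8601 duration strings: "P29W" = 29 weeks, "P30W" = 30 weeks
    mouse1 = Subject(subject_id="mouse1", age="P29W", sex="F", species="Mus musculus")
    mouse2 = Subject(subject_id="mouse2", age="P30W", sex="M", species="Mus musculus")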

.. note::
If you're trying to specify metadata that is shared across sessions, you can use the **Global Metadata** feature.

Pressing the Edit Global Metadata button at the top of the page opens a pop-up form that lets you provide a
single default value for each property that isn’t expected to be unique across sessions.

These values will take effect as soon as the pop-up form has been submitted.

While Global Metadata is less relevant when we’re working with only two subjects, this feature can be very powerful when you’re working with tens or even hundreds of subjects in one conversion.

We recommend using Global Metadata to correct issues caught by the **NWB Inspector** that are seen across several sessions.

You’ll be able to specify Global Metadata on the Source Data and File Metadata pages as well.


Source Data Information
-----------------------
Because we used the Locate Data page to programmatically identify our source data, this page should mostly be complete. You can use this opportunity to verify that the identified paths appear as expected for each session.

.. figure:: ../assets/tutorials/multiple/sourcedata-page.png
:align: center
:alt: Complete source data forms

One notable difference between this and the single-session workflow, however, is that the next few pages will allow you to toggle between sessions using the **session manager** sidebar on the left.

Session Metadata
----------------
Aside from the session manager and global metadata features noted above, the file metadata page in the multi-session workflow is nearly identical to the single-session version.

.. figure:: ../assets/tutorials/multiple/metadata-nwbfile.png
:align: center
:alt: Complete General Metadata form

A complete General Metadata form

Acting as global metadata, the information supplied on the subject metadata page has pre-filled the Subject metadata for each session.

.. figure:: ../assets/tutorials/multiple/metadata-subject-complete.png
:align: center
:alt: Complete Subject metadata form

A complete Subject metadata form

Finish the rest of the workflow as you would for a single session: review the preview files with the NWB Inspector and Neurosift, then complete the full conversion.

Congratulations on completing your first multi-session conversion! You can now convert multiple sessions at once, saving you time and effort.
6 changes: 3 additions & 3 deletions docs/tutorials/single_session.rst
@@ -98,18 +98,18 @@ Session Metadata
^^^^^^^^^^^^^^^^
The file metadata page is a great opportunity to add rich annotations to the file, which will be read by anyone reusing your data in the future!

The Session Start Time in the General Metadata section is already specified because this field was automatically extracted from the SpikeGLX source data.
The Session Start Time in the **General Metadata** section is already specified because this field was automatically extracted from the SpikeGLX source data.

.. figure:: ../assets/tutorials/single/metadata-nwbfile.png
:align: center
:alt: Metadata page with invalid Subject information


However, we still need to add the Subject information—as noted by the red accents around that item. Let’s say that our subject is a male mouse with an age of P30D, which represents 30 days old.
However, we still need to add the Subject information—as noted by the red accents around that item. Let’s say that our subject is a male mouse with an age of P25W, which represents 25 weeks old.

.. figure:: ../assets/tutorials/single/metadata-subject-complete.png
:align: center
:alt: Metadata page with valid Subject information
:alt: Metadata page with valid **Subject** information

The status of the Subject information will update in real-time as you fill out the form.

1 change: 0 additions & 1 deletion environments/environment-Windows.yml
@@ -22,4 +22,3 @@ dependencies:
- pytest == 7.2.2
- pytest-cov == 4.1.0
- scikit-learn == 1.4.0
- scipy == 1.12.0
7 changes: 6 additions & 1 deletion nwb-guide.spec
@@ -12,7 +12,12 @@ from PyInstaller.utils.hooks import collect_all

datas = [('./paths.config.json', '.'), ('./package.json', '.')]
binaries = []
hiddenimports = [ 'email_validator', *collect_submodules('scipy.special.cython_special'), *os.path.join(os.path.dirname(scipy.__file__), '.libs')]
hiddenimports = [
'email_validator',
*collect_submodules('scipy.special.cython_special'),
*collect_submodules('scipy.special._cdflib'),
*os.path.join(os.path.dirname(scipy.__file__), '.libs')
]

datas += collect_data_files('jsonschema_specifications')
tmp_ret = collect_all('dandi')
31 changes: 16 additions & 15 deletions pyflask/manageNeuroconv/manage_neuroconv.py
@@ -1058,6 +1058,7 @@ def generate_test_data(output_path: str):
import spikeinterface
from spikeinterface.extractors import NumpyRecording
from spikeinterface.exporters import export_to_phy
from spikeinterface.preprocessing import scale, bandpass_filter, resample

base_path = Path(output_path)
spikeglx_output_folder = base_path / "spikeglx"
@@ -1067,33 +1068,29 @@
duration_in_s = 3.0
number_of_units = 50
number_of_channels = 385 # Have to include 'sync' channel to be proper SpikeGLX. TODO: artificiate sync pulses
ap_conversion_factor_to_uV = 2.34375
conversion_factor_to_uV = 2.34375
ap_sampling_frequency = 30_000.0
lf_sampling_frequency = 2_500.0
downsample_factor = int(ap_sampling_frequency / lf_sampling_frequency)

# Generate synthetic spiking and voltage traces with waveforms around them
artificial_ap_band, spiking = spikeinterface.generate_ground_truth_recording(
artificial_ap_band_in_uV, spiking = spikeinterface.generate_ground_truth_recording(
durations=[duration_in_s],
sampling_frequency=ap_sampling_frequency,
num_channels=number_of_channels,
dtype="float32",
num_units=number_of_units,
seed=0, # Fixed seed for reproducibility
)
artificial_ap_band.set_channel_gains(gains=ap_conversion_factor_to_uV)
waveform_extractor = spikeinterface.extract_waveforms(recording=artificial_ap_band, sorting=spiking, mode="memory")
int16_artificial_ap_band = artificial_ap_band.astype(dtype="int16")

# Approximate behavior of LF band with filter and downsampling
# TODO: currently looks a little out of scale?
artificial_lf_filter = spikeinterface.preprocessing.bandpass_filter(
recording=artificial_ap_band, freq_min=10, freq_max=300
)
int16_artificial_lf_band = NumpyRecording(
traces_list=artificial_lf_filter.get_traces()[::downsample_factor],
sampling_frequency=lf_sampling_frequency,
)

unscaled_artificial_ap_band = scale(recording=artificial_ap_band_in_uV, gain=1 / conversion_factor_to_uV)
int16_artificial_ap_band = unscaled_artificial_ap_band.astype(dtype="int16")
int16_artificial_ap_band.set_channel_gains(conversion_factor_to_uV)

unscaled_artificial_lf_filter = bandpass_filter(recording=unscaled_artificial_ap_band, freq_min=0.5, freq_max=1_000)
unscaled_artificial_lf_band = resample(recording=unscaled_artificial_lf_filter, resample_rate=2_500)
int16_artificial_lf_band = unscaled_artificial_lf_band.astype(dtype="int16")
int16_artificial_lf_band.set_channel_gains(conversion_factor_to_uV)

ap_file_path = spikeglx_output_folder / "Session1_g0" / "Session1_g0_imec0" / "Session1_g0_t0.imec0.ap.bin"
ap_meta_file_path = spikeglx_output_folder / "Session1_g0" / "Session1_g0_imec0" / "Session1_g0_t0.imec0.ap.meta"
@@ -1115,6 +1112,10 @@ def generate_test_data(output_path: str):
io.write(lf_meta_content)

# Make Phy folder
waveform_extractor = spikeinterface.extract_waveforms(
recording=artificial_ap_band_in_uV, sorting=spiking, mode="memory"
)

export_to_phy(
waveform_extractor=waveform_extractor, output_folder=phy_output_folder, remove_if_exists=True, copy_binary=False
)
2 changes: 2 additions & 0 deletions src/renderer/src/stories/Main.js
@@ -190,10 +190,12 @@ export class Main extends LitElement {
}

const headerEl = header ? (this.header = new GuidedHeader(header)) : html`<div></div>`; // Render for grid
if (!header) delete this.header; // Reset header

if (!header) delete this.header; // Reset header

const footerEl = footer ? (this.footer = new GuidedFooter(footer)) : html`<div></div>`; // Render for grid
if (!footer) delete this.footer; // Reset footer

const title = header?.title ?? page.info?.title;

2 changes: 1 addition & 1 deletion src/renderer/src/stories/pages/guided-mode/GuidedFooter.js
@@ -48,7 +48,7 @@ export class GuidedFooter extends LitElement {
}

updated() {
this.to = (transition) => this.parentElement.to(transition);
this.to = (transition) => this.parentElement.to(transition); // Run main page's transition function
}

render() {
48 changes: 41 additions & 7 deletions tests/e2e/config.ts
@@ -35,6 +35,13 @@ export const alwaysDelete = [
// ------------------------ Configuration Options ------------------------
// -----------------------------------------------------------------------

const autocompleteOptions = {
subject_id: 'mouse1',
session_id: 'Session1'
}

const { subject_id, session_id } = autocompleteOptions

export const testInterfaceInfo = {
common: {
SpikeGLXRecordingInterface: {
@@ -46,10 +53,12 @@
},
multi: {
SpikeGLXRecordingInterface: {
format: '{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_g0/{subject_id}_{session_id}_g0_imec0/{subject_id}_{session_id}_g0_t0.imec0.ap.bin'
format: '{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_g0/{subject_id}_{session_id}_g0_imec0/{subject_id}_{session_id}_g0_t0.imec0.ap.bin',
autocomplete: {}
},
PhySortingInterface: {
format: '{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_phy'
format: '{subject_id}/{subject_id}_{session_id}/{subject_id}_{session_id}_phy',
autocomplete: {}
}
},
single: {
@@ -62,14 +71,39 @@
}
}

// Add autocomplete options
Object.entries(testInterfaceInfo.multi).forEach(([key, value]) => {
const format = value.format
value.autocomplete = {
path: join(testDatasetPath, format.replace(/{subject_id}/g, subject_id).replace(/{session_id}/g, session_id)),
...autocompleteOptions,
}
})


export const subjectInfo = {
sex: 'M',
species: 'Mus musculus',
age: 'P30D'
common: {
sex: 'M',
species: 'Mus musculus',
},

single: {
age: 'P25W'
},

multiple: {
mouse1: {
age: 'P29W',
sex: 'F'
},
mouse2: {
age: 'P30W'
}
}
}

// export const regenerateTestData = !existsSync(testDataRootPath) || false // Generate only if doesn't exist
export const regenerateTestData = true // Force regeneration
export const regenerateTestData = !existsSync(testDataRootPath) || false // Generate only if doesn't exist
// export const regenerateTestData = true // Force regeneration

export const dandiInfo = {
id: '212750',
2 changes: 1 addition & 1 deletion tests/e2e/e2e.test.ts
@@ -98,7 +98,7 @@ describe('E2E Test', () => {
})
})

describe.skip('Complete a multi-session workflow', async () => {
describe('Complete a multi-session workflow', async () => {
const subdirectory = 'multiple'
await runWorkflow('Multi Session Workflow', { upload_to_dandi: false, multiple_sessions: true, locate_data: true }, subdirectory)

