
Commit 3d591ac

Merge pull request #731 from NeurodataWithoutBorders/user-tests-update-posttutorial

04/03/24 User Test Updates
CodyCBakerPhD authored Apr 9, 2024
2 parents 871997b + e9d14a2 commit 3d591ac
Showing 15 changed files with 50 additions and 31 deletions.
Binary file removed docs/assets/tutorials/multiple/intro-page.png
Binary file modified docs/assets/tutorials/multiple/pathexpansion-basepath.png
Binary file modified docs/assets/tutorials/multiple/pathexpansion-completed.png
Binary file removed docs/assets/tutorials/single/intro-page.png
4 changes: 2 additions & 2 deletions docs/tutorials/dataset.rst
@@ -9,15 +9,15 @@ To get you started as quickly as possible, we’ve created a way to generate thi

The **Phy** data format stores spike sorting results.

- Navigate to the **Settings** page using the main sidebar. Then press the **Generate** button in the top-right corner to initiate the dataset creation.
+ Navigate to the **Settings** page using the button at the bottom of the main sidebar. Then press the **Generate** button in the top-right corner to initiate the dataset creation.

.. figure:: ../assets/tutorials/dataset-creation.png
:align: center
:alt: Dataset Creation Screen

Press the Generate button on the Settings page to create the dataset.

- The generated data will populate in the ``~/NWB_GUIDE/test_data`` directory and include a ``data`` folder with the original data as well as a ``dataset`` folder that duplicates this ``data`` across multiple subjects and sessions.
+ The generated data will populate in the ``~/NWB_GUIDE/test_data`` directory, where ``~`` is the home directory of your system. This includes a ``data`` folder with the original data as well as a ``dataset`` folder that duplicates this ``data`` across multiple subjects and sessions.

.. code-block:: bash
7 changes: 5 additions & 2 deletions docs/tutorials/index.rst
@@ -1,12 +1,15 @@
Tutorials
=======================================
The NWB Graphical User Interface for Data Entry (GUIDE) is a desktop tool for converting neurophysiological data
- to the Neurodata Without Borders (NWB) format and uploading to the DANDI Archive. In these tutorials, we detail this
- process from initial setup to final upload.
+ to the Neurodata Without Borders (NWB) format and uploading to the DANDI Archive.

+ In these tutorials, you'll follow along on a :doc:`local installation of the GUIDE </installation>` as we detail the conversion process from initial setup to final upload.

+ Before you begin these tutorials, **you'll need to generate the tutorial dataset** using the instructions on the Dataset page.




.. toctree::
:maxdepth: 2

7 changes: 6 additions & 1 deletion docs/tutorials/multiple_sessions.rst
@@ -3,6 +3,8 @@ Managing Multiple Sessions

Now, let’s say that you’ve already run some of your experiments and now you want to convert them all at the same time. This is where a multi-session workflow will come in handy.

+ Begin a new conversion on the **Convert** page and provide a name for your pipeline.

Workflow Configuration
----------------------

@@ -12,7 +14,9 @@ On the Workflow page, confirm that this pipeline will be run on multiple session
:align: center
:alt: Workflow page with multiple sessions and locate data selected

- Complete the first section of the GUIDE as normal until you reach a new **Locate Data** page after the Data Formats page.
+ Data Formats
+ ------------
+ As before, specify **SpikeGLX Recording** and **Phy Sorting** as the data formats for this conversion.

Locate Data
-----------
@@ -91,6 +95,7 @@ We should also indicate the ``sex`` of each subject since this is a requirement

You’ll be able to specify Global Metadata on the Source Data and File Metadata pages as well.

+ Advance to the next page when you have entered subject metadata for all subjects.

Source Data Information
-----------------------
8 changes: 5 additions & 3 deletions docs/tutorials/single_session.rst
@@ -83,16 +83,17 @@ Data Entry

Source Data Information
^^^^^^^^^^^^^^^^^^^^^^^
- On this page, specify the relevant **.bin** (Spikeglx) file and **phy** folder so that the GUIDE can find this source data to complete the conversion.
+ On this page, specify the **.ap.bin** (SpikeGLX) file and **phy** folder so that the GUIDE can find this source data to complete the conversion.

- As discussed in the :doc:`Dataset Generation </tutorials/dataset>` tutorial, these can be found in the ``~/NWB_GUIDE/test-data/data`` directory.
+ As discussed in the :doc:`Dataset Generation </tutorials/dataset>` tutorial, these can be found in the ``~/NWB_GUIDE/test-data/data`` directory, where **~** is the home directory of your system.

You can either click the file selector to navigate to the file or drag-and-drop into the GUIDE from your file navigator.

.. figure:: ../assets/tutorials/single/sourcedata-page-specified.png
:align: center
:alt: Source Data page with source locations specified

+ Advance to the next page to extract metadata from the source data.

Session Metadata
^^^^^^^^^^^^^^^^
@@ -104,6 +105,7 @@ The Session Start Time in the **General Metadata** section is already specified
:align: center
:alt: Metadata page with invalid Subject information

+ While the **General Metadata** section is complete, take some time to fill out additional information such as the **Institutional Info** box and the **Experimenter** field.

However, we still need to add the Subject information—as noted by the red accents around that item. Let’s say that our subject is a male mouse with an age of P25W, which represents 25 weeks old.

@@ -137,7 +139,7 @@ The Inspector Report page allows you to validate the preview file against the la
:align: center
:alt: NWB Inspector report


+ Advance to the next page when you are satisfied with the Inspector Report.

Conversion Preview
^^^^^^^^^^^^^^^^^^
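The subject metadata described in this tutorial follows the NWB convention of ISO 8601 durations for age (``P25W`` is a period of 25 weeks). A minimal sketch of the fields the tutorial asks for, with hypothetical ``subject_id`` and ``species`` values:

```ts
// Sketch only: values other than sex/age are hypothetical examples.
const subjectMetadata = {
    subject_id: "mouse01",   // hypothetical identifier
    sex: "M",                // male, as in the tutorial
    age: "P25W",             // ISO 8601 duration: 25 weeks old
    species: "Mus musculus", // assumed; the tutorial only says "mouse"
};
```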
5 changes: 0 additions & 5 deletions src/renderer/src/pages.js
@@ -2,7 +2,6 @@ import { GettingStartedPage } from "./stories/pages/getting-started/GettingStart
import { DocumentationPage } from "./stories/pages/documentation/Documentation";
import { ContactPage } from "./stories/pages/contact-us/Contact";
import { GuidedHomePage } from "./stories/pages/guided-mode/GuidedHome";
- import { GuidedStartPage } from "./stories/pages/guided-mode/GuidedStart";
import { GuidedNewDatasetPage } from "./stories/pages/guided-mode/setup/GuidedNewDatasetInfo";
import { GuidedStructurePage } from "./stories/pages/guided-mode/data/GuidedStructure";
import { sections } from "./stories/pages/globals";
@@ -85,10 +84,6 @@ const pages = {
label: "Convert",
icon: guidedIcon,
pages: {
-         start: new GuidedStartPage({
-             label: "Start",
-         }),

details: new GuidedNewDatasetPage({
title: "Project Setup",
label: "Project details",
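With the ``GuidedStartPage`` entry removed, the guided "Convert" section now begins directly at Project Setup. A minimal sketch of the resulting shape, assuming the surrounding page definitions are otherwise unchanged:

```ts
// Sketch only: the trimmed Convert page map, shapes assumed from the diff context.
import { GuidedNewDatasetPage } from "./stories/pages/guided-mode/setup/GuidedNewDatasetInfo";

const convertPages = {
    // the former `start: new GuidedStartPage(...)` entry is gone
    details: new GuidedNewDatasetPage({
        title: "Project Setup",
        label: "Project details",
    }),
    // ...remaining guided-mode pages follow as before
};
```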
@@ -265,6 +265,7 @@ export class GuidedPathExpansionPage extends Page {
workflow = {
subject_id: {},
session_id: {},
+         base_directory: {},
locate_data: {
skip: () => {
this.#initialize();
@@ -432,9 +433,15 @@
// Require properties for all sources
const generatedSchema = { type: "object", properties: {}, additionalProperties: false };
const controls = {};

+         const baseDirectory = this.workflow.base_directory.value;
+         const globals = (structureState.globals = {});

for (let key in this.info.globalState.interfaces) {
generatedSchema.properties[key] = { type: "object", ...pathExpansionSchema };

+             if (baseDirectory) globals[key] = { base_directory: baseDirectory };

controls[key] = {
format_string_path: [
new Button({
@@ -450,8 +457,6 @@
}
structureState.schema = generatedSchema;

-         // this.optional.requestUpdate();

const form = (this.form = new JSONSchemaForm({
...structureState,
onThrow,
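The effect of the added lines is to seed every interface's path-expansion options with the base directory captured by the new Workflow question. A standalone sketch of that fan-out, with assumed types since the page's internal shapes are not shown here:

```ts
// Sketch only: shapes and example arguments are assumptions, not the GUIDE's actual internals.
type PathExpansionGlobals = Record<string, { base_directory?: string }>;

function seedGlobals(interfaceKeys: string[], baseDirectory?: string): PathExpansionGlobals {
    const globals: PathExpansionGlobals = {};
    for (const key of interfaceKeys) {
        // Only set a per-interface default when the user actually provided a base directory
        if (baseDirectory) globals[key] = { base_directory: baseDirectory };
    }
    return globals;
}

// Hypothetical usage:
// seedGlobals(["SpikeGLX Recording", "Phy Sorting"], "~/NWB_GUIDE/test-data/dataset");
```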
13 changes: 13 additions & 0 deletions src/renderer/src/stories/pages/guided-mode/setup/Preform.js
@@ -43,6 +43,18 @@ const questions = {
},
default: false,
},

+     base_directory: {
+         type: "string",
+         format: "directory",
+         title: "Where is your data located?",
+         description:
+             "A single directory where all data is contained. Can override for specific data formats.<br><small>Leave blank if unknown</small>",
+         dependencies: {
+             locate_data: {},
+         },
+     },

upload_to_dandi: {
type: "boolean",
title: "Would you like to upload your data to DANDI?",
@@ -81,6 +93,7 @@ const projectWorkflowSchema = {
return acc;
}, {}),
order: Object.keys(questions),
+     additionalProperties: false,
};

// ----------------------------------------------------------------------
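The new ``base_directory`` question is a JSON-Schema-style property with a ``directory`` format and a dependency on ``locate_data``, and ``additionalProperties: false`` now rejects any answer key that is not declared. A minimal sketch of how such a question map might collapse into the workflow schema, assuming the same reduce pattern shown in the diff:

```ts
// Sketch only: the locate_data wording is hypothetical; base_directory mirrors the diff above.
type Question = { type: string; [key: string]: unknown };

const questions: Record<string, Question> = {
    locate_data: { type: "boolean", title: "Will you locate the source data programmatically?" }, // hypothetical title
    base_directory: {
        type: "string",
        format: "directory",
        title: "Where is your data located?",
        dependencies: { locate_data: {} }, // only relevant when locate_data applies
    },
};

const projectWorkflowSchema = {
    type: "object",
    properties: Object.entries(questions).reduce((acc, [name, question]) => {
        acc[name] = question;
        return acc;
    }, {} as Record<string, Question>),
    order: Object.keys(questions),
    additionalProperties: false, // reject undeclared answer keys
};
```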
28 changes: 12 additions & 16 deletions tests/e2e/workflow.ts
@@ -88,11 +88,6 @@ export default async function runWorkflow(name, workflow, identifier) {

test('Create new pipeline by specifying a name', async () => {

-         // Advance to instructions page
-         await toNextPage('start')
-
-         await takeScreenshot(join(identifier, 'intro-page'), 500)

// Advance to general information page
await toNextPage('details')

@@ -209,26 +204,27 @@
baseInput.updateData(basePath)
})

+             dashboard.main.querySelector('main > section').scrollTop = 200 // Scroll down to see all interfaces

},
testInterfaceInfo,
testDatasetPath
)

await takeScreenshot(join(identifier, 'pathexpansion-basepath'), 300)

-     const name = Object.keys(testInterfaceInfo.common)[0]
-     const interfaceId = testInterfaceInfo.common[name].id
-     const autocompleteInfo = testInterfaceInfo.multi[name].autocomplete

-     await evaluate(id => {
+     const interfaceId = await evaluate(() => {
const dashboard = document.querySelector('nwb-dashboard')
const form = dashboard.page.form
+         const id = Object.keys(form.accordions)[0]
const formatInput = form.getFormElement([id, 'format_string_path'])
const autocompleteButton = formatInput.controls[0]
autocompleteButton.onClick()
-     }, interfaceId)
+         return id
+     })

+     // Use autocomplete on first interface
+     const name = Object.entries(testInterfaceInfo.common).find(([name, info]) => info.id === interfaceId)![0]
+     const autocompleteInfo = testInterfaceInfo.multi[name].autocomplete

await takeScreenshot(join(identifier, 'pathexpansion-autocomplete-open'), 300)

@@ -258,16 +254,16 @@
const dashboard = document.querySelector('nwb-dashboard')
const form = dashboard.page.form

+         const accordionKeys = Object.keys(form.accordions)

// Fill out the path expansion information for non-autocompleted interfaces
-         Object.entries(common).slice(1).forEach(([ name, info ]) => {
-             const id = info.id
+         accordionKeys.slice(1).forEach(id => {
+             const name = Object.entries(common).find(([_, info]) => info.id === id)![0]
const { format } = multi[name]
const formatInput = form.getFormElement([id, 'format_string_path'])
formatInput.updateData(format)
})

-         dashboard.main.querySelector('main > section').scrollTop = 200

}, testInterfaceInfo)


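The key change in the test is that the interface id is now discovered inside the page context and returned from ``evaluate`` instead of being passed in. A minimal sketch of that pattern, assuming a Puppeteer-style ``page.evaluate`` rather than the repo's own ``evaluate`` helper:

```ts
// Sketch only: the GUIDE test suite wraps this in its own `evaluate` helper.
import type { Page } from "puppeteer";

async function getFirstAccordionId(page: Page): Promise<string> {
    return page.evaluate(() => {
        const dashboard = document.querySelector("nwb-dashboard") as any;
        const form = dashboard.page.form;
        // Return the id of the first accordion; only serializable values cross the bridge.
        return Object.keys(form.accordions)[0];
    });
}
```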
