Export of group-level tables for aparc- and aseg-stats #26

Merged (8 commits, Feb 13, 2017)
5 changes: 5 additions & 0 deletions Dockerfile
@@ -60,6 +60,11 @@ ENV PATH /opt/freesurfer/bin:/opt/freesurfer/fsfast/bin:/opt/freesurfer/tktools:
ENV PYTHONPATH=""
RUN echo "cHJpbnRmICJrcnp5c3p0b2YuZ29yZ29sZXdza2lAZ21haWwuY29tXG41MTcyXG4gKkN2dW12RVYzelRmZ1xuRlM1Si8yYzFhZ2c0RVxuIiA+IC9vcHQvZnJlZXN1cmZlci9saWNlbnNlLnR4dAo=" | base64 -d | sh

# make freesurfer python scripts python3 ready
RUN 2to3-3.4 -w $FREESURFER_HOME/bin/aparcstats2table
RUN 2to3-3.4 -w $FREESURFER_HOME/bin/asegstats2table
RUN 2to3-3.4 -w $FREESURFER_HOME/bin/*.py
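The `2to3` calls above rewrite FreeSurfer's Python 2 scripts in place so they run under Python 3. A minimal sketch of the kind of rewrite `2to3` performs — a generic illustration, not the actual contents of the FreeSurfer scripts:

```python
# Hypothetical before/after illustrating typical 2to3 fixers; the names below
# are invented for the example, not taken from aparcstats2table.

# Python 2 style (as in the original scripts):
#     print "subjects:", name
#     for name, value in stats.iteritems():
#         ...

# Python 3 style, as emitted by 2to3:
stats = {"lh.aparc": "thickness"}
for name, value in stats.items():   # .iteritems() -> .items()
    print("subjects:", name)        # print statement -> print() function
```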
Contributor:
why do we need this?

Contributor Author:
The other option would be to install Python 2. Is that your preference?

Contributor:

The only problem with this option is that it breaks the ability to run the script outside of Docker. People without root privileges cannot run Docker, so it is a problem for them.

Contributor:

Why would you want to run Python scripts from inside the container using a Python interpreter from outside the container (the host)? How are you using this App without Docker? Are you using Singularity?

Contributor:

I don't know if it is intended, but I just run the app by calling run.py on a server with FreeSurfer installed.

Contributor:

Oh, I see. It's intended to be run via Docker or Singularity. The latter works on clusters/HPC without root. You should check it out: http://singularity.lbl.gov/ http://bids-apps.neuroimaging.io/tutorial/

Contributor:

OK, but my cluster does not have Singularity installed, and if I install it myself with my user privileges I get this error when I try to execute the image:

-bash-4.1$ ./bids_example-2016-10-27-709845fcdcd0.img 
ERROR  : Singularity must be executed in privileged mode to use images
ABORT  : Retval = 255

Anyway, it is not a big problem for me, because I can easily modify this part of the script myself; everything else works fine without needing to be in a container.

Contributor:

I recommend contacting your admins and asking them to install Singularity system-wide. It will make both of your lives easier!


RUN mkdir /scratch
RUN mkdir /local-scratch

167 changes: 100 additions & 67 deletions README.md
@@ -22,71 +22,89 @@ https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation
This App has the following command line arguments:

$ docker run -ti --rm bids/freesurfer --help
usage: run.py [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--template_name TEMPLATE_NAME]
bids_dir output_dir {participant,group}

FreeSurfer recon-all + custom template generation.

NOTE: if scripts/IsRunning is present, this pipeline assumes recon-all was
interrupted and removes the directory then re-runs the processing stream.

positional arguments:
bids_dir The directory with the input dataset formatted
according to the BIDS standard.
output_dir The directory where the output files should be stored.
If you are running group level analysis this folder
should be prepopulated with the results of
the participant level analysis.
{participant,group} Level of the analysis that will be performed. Multiple
participant level analyses can be run independently
(in parallel) using the same output_dir.
required arguments:
--license_key LICENSE_KEY
FreeSurfer license key - letters and numbers after "*"
in the email you received after registration. To
register (for free) visit
https://surfer.nmr.mgh.harvard.edu/registration.html

optional arguments:
-h, --help show this help message and exit
--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]
The label of the participant that should be analyzed.
The label corresponds to sub-<participant_label> from
the BIDS spec (so it does not include "sub-"). If this
parameter is not provided all subjects should be
analyzed. Multiple participants can be specified with
a space separated list.
--n_cpus N_CPUS Number of CPUs/cores available to use. (Default is 1)
--stages {all,autorecon1,autorecon2,autorecon2-cp,autorecon2-wm,autorecon2-pial,autorecon3,autorecon-all}
Recon-all stages to run. (Default is autorecon-all)
--template_name TEMPLATE_NAME
Name for the custom group level template generated
for this dataset.
--acquisition_label ACQUISITION_LABEL
If the dataset contains multiple T1 weighted images
from different acquisitions which one should be used?
Corresponds to "acq-<acquisition_label>"
--multiple_sessions {longitudinal, multiday}
For datasets with multiday sessions where you do not
want to use the longitudinal pipeline, i.e., sessions
were back-to-back, set this to multiday, otherwise
sessions with T1w data will be considered independent
sessions for longitudinal analysis.
--refine_pial {T2,FLAIR,None,T1only}
If the dataset contains 3D T2 or T2 FLAIR weighted
images (~1x1x1), these can be used to refine the pial
surface. The current default is to look for
appropriate T2s, then look for appropriate FLAIRs
(resolution <1.2mm isovolumetric). If you want to
ignore these, specify None or T1only to generate
surfaces on the T1 alone.
--hires_mode {auto,enable,disable}
Submillimeter (high resolution) processing. 'auto' -
use only if <1.0mm data detected, 'enable' - force on,
'disable' - force off
usage: run.py [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--n_cpus N_CPUS]
[--stages {autorecon1,autorecon2,autorecon2-cp,autorecon2-wm,autorecon2-pial,autorecon3,autorecon-all,all}
[{autorecon1,autorecon2,autorecon2-cp,autorecon2-wm,autorecon2-pial,autorecon3,autorecon-all,all} ...]]
[--template_name TEMPLATE_NAME] --license_key LICENSE_KEY
[--acquisition_label ACQUISITION_LABEL]
[--multiple_sessions {longitudinal,multiday}]
[--refine_pial {T2,FLAIR,None,T1only}]
[--hires_mode {auto,enable,disable}]
[--parcellations {aparc,aparc.a2009s} [{aparc,aparc.a2009s} ...]]
[--measurements {area,volume,thickness,thicknessstd,meancurv,gauscurv,foldind,curvind}
[{area,volume,thickness,thicknessstd,meancurv,gauscurv,foldind,curvind} ...]]
[-v]
bids_dir output_dir {participant,group1,group2}

FreeSurfer recon-all + custom template generation.

positional arguments:
bids_dir The directory with the input dataset formatted
according to the BIDS standard.
output_dir The directory where the output files should be stored.
If you are running group level analysis this folder
should be prepopulated with the results of
the participant level analysis.
{participant,group1,group2}
Level of the analysis that will be performed. Multiple
participant level analyses can be run independently
(in parallel) using the same output_dir. "group1"
creates a study specific group template. "group2"
exports group stats tables for cortical parcellation
and subcortical segmentation.

optional arguments:
-h, --help show this help message and exit
--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]
The label of the participant that should be analyzed.
The label corresponds to sub-<participant_label> from
the BIDS spec (so it does not include "sub-"). If this
parameter is not provided all subjects should be
analyzed. Multiple participants can be specified with
a space separated list.
--n_cpus N_CPUS Number of CPUs/cores available to use.
--stages {autorecon1,autorecon2,autorecon2-cp,autorecon2-wm,autorecon2-pial,autorecon3,autorecon-all,all}
[{autorecon1,autorecon2,autorecon2-cp,autorecon2-wm,autorecon2-pial,autorecon3,autorecon-all,all} ...]
Autorecon stages to run.
--template_name TEMPLATE_NAME
Name for the custom group level template generated for
this dataset
--license_key LICENSE_KEY
FreeSurfer license key - letters and numbers after "*"
in the email you received after registration. To
register (for free) visit
https://surfer.nmr.mgh.harvard.edu/registration.html
--acquisition_label ACQUISITION_LABEL
If the dataset contains multiple T1 weighted images
from different acquisitions which one should be used?
Corresponds to "acq-<acquisition_label>"
--multiple_sessions {longitudinal,multiday}
For datasets with multiday sessions where you do not
want to use the longitudinal pipeline, i.e., sessions
were back-to-back, set this to multiday, otherwise
sessions with T1w data will be considered independent
sessions for longitudinal analysis.
--refine_pial {T2,FLAIR,None,T1only}
If the dataset contains 3D T2 or T2 FLAIR weighted
images (~1x1x1), these can be used to refine the pial
surface. If you want to ignore these, specify None or
T1only to base surfaces on the T1 alone.
--hires_mode {auto,enable,disable}
Submillimeter (high resolution) processing. 'auto' -
use only if <1.0mm data detected, 'enable' - force on,
'disable' - force off
--parcellations {aparc,aparc.a2009s} [{aparc,aparc.a2009s} ...]
Group2 option: cortical parcellation(s) to extract
stats from.
--measurements {area,volume,thickness,thicknessstd,meancurv,gauscurv,foldind,curvind}
[{area,volume,thickness,thicknessstd,meancurv,gauscurv,foldind,curvind} ...]
Group2 option: cortical measurements to extract stats for.
-v, --version show program's version number and exit


#### Participant level
To run it in participant level mode (for one participant):

docker run -ti --rm \
@@ -96,12 +114,27 @@ To run it in participant level mode (for one participant):
/bids_dataset /outputs participant --participant_label 01 \
--license_key "XXXXXXXX"
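The participant-level command above can be generated once per subject and the runs dispatched in parallel. A minimal Python sketch, with labels and host paths purely illustrative (matching the example above):

```python
# Hypothetical sketch: build one participant-level docker command per subject
# label so the runs can be launched in parallel (e.g. via subprocess or a job
# scheduler). Labels and host paths are illustrative, not required values.
labels = ["01", "02", "03"]
cmds = [
    ["docker", "run", "-ti", "--rm",
     "-v", "/Users/filo/data/ds005:/bids_dataset:ro",
     "-v", "/Users/filo/outputs:/outputs",
     "bids/freesurfer",
     "/bids_dataset", "/outputs", "participant",
     "--participant_label", label,
     "--license_key", "XXXXXXXX"]
    for label in labels
]
print(len(cmds))  # one command list per participant
```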

After doing this for all subjects (potentially in parallel) the group level analysis
can be run:

#### Group level
After doing this for all subjects (potentially in parallel) the
group level analyses can be run.

To create a study specific template run:

docker run -ti --rm \
-v /Users/filo/data/ds005:/bids_dataset:ro \
-v /Users/filo/outputs:/outputs \
bids/freesurfer \
/bids_dataset /outputs group1 \
--license_key "XXXXXXXX"

To export tables with aggregated measurements within regions of
cortical parcellation and subcortical segmentation run:

docker run -ti --rm \
-v /Users/filo/data/ds005:/bids_dataset:ro \
-v /Users/filo/outputs:/outputs \
bids/freesurfer \
/bids_dataset /outputs group \
/bids_dataset /outputs group2 \
--license_key "XXXXXXXX"
Also see *--parcellations* and *--measurements* arguments.
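The exported stats tables (e.g. `00_group2_stats_tables/lh.aparc.thickness.tsv`) are plain tab-separated files, so any TSV reader works. A sketch with an assumed, simplified layout — the actual columns come from FreeSurfer's aparcstats2table and may differ:

```python
import csv
import io

# Hypothetical contents of an exported group table; region names and values
# below are invented for the example.
tsv = (
    "lh.aparc.thickness\tbankssts_thickness\tcaudalanteriorcingulate_thickness\n"
    "sub-01\t2.5\t2.7\n"
    "sub-02\t2.4\t2.6\n"
)
rows = list(csv.reader(io.StringIO(tsv), delimiter="\t"))
header, data = rows[0], rows[1:]  # first row: region names; rest: one row per subject
print(header[1], data[0][0])
```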
13 changes: 13 additions & 0 deletions circle.yml
@@ -1,3 +1,8 @@
general:
artifacts:
- "~/outputs1"
- "~/outputs2"

machine:
services:
- docker #don't use 1.10 - caching is broken
@@ -10,6 +15,9 @@ dependencies:
override:
- if [[ -e ~/docker/image.tar ]]; then docker load -i ~/docker/image.tar; fi
- if [[ ! -d ~/data/ds114_test1 ]]; then wget -c -O ${HOME}/ds114_test1.tar "https://files.osf.io/v1/resources/9q7dv/providers/osfstorage/57e54a326c613b01d7d3ed90" && mkdir -p ${HOME}/data && tar xf ${HOME}/ds114_test1.tar -C ${HOME}/data; fi
- if [[ ! -d ~/data/ds114_test2 ]]; then wget -c -O ${HOME}/ds114_test2.tar "https://files.osf.io/v1/resources/9q7dv/providers/osfstorage/57e549f9b83f6901d457d162" && mkdir -p ${HOME}/data && tar xf ${HOME}/ds114_test2.tar -C ${HOME}/data; fi
- if [[ ! -d ~/data/ds114_test1_freesurfer ]]; then wget -c -O ${HOME}/ds114_test1_freesurfer.tar "https://files.osf.io/v1/resources/9q7dv/providers/osfstorage/5882adf3b83f6901f564da49" && mkdir -p ${HOME}/data && tar xf ${HOME}/ds114_test1_freesurfer.tar -C ${HOME}/data; fi
- if [[ ! -d ~/data/ds114_test2_freesurfer ]]; then wget -c -O ${HOME}/ds114_test2_freesurfer.tar "https://files.osf.io/v1/resources/9q7dv/providers/osfstorage/5882b0e3b83f6901fb64da18" && mkdir -p ${HOME}/data && tar xf ${HOME}/ds114_test2_freesurfer.tar -C ${HOME}/data; fi
- git describe --tags > version
- docker build -t bids/${CIRCLE_PROJECT_REPONAME} . :
timeout: 21600
@@ -21,6 +29,11 @@ test:
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds114_test1:/bids_dataset bids/${CIRCLE_PROJECT_REPONAME,,} --version
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds114_test1:/bids_dataset -v ${HOME}/outputs1:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} /bids_dataset /outputs participant --participant_label 01 --license_key="~/test.key" --stages autorecon1:
timeout: 21600
# group2 tests
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds114_test1:/bids_dataset -v ${HOME}/data/ds114_test1_freesurfer:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} /bids_dataset /outputs group2 --license_key="~/test.key" && mkdir -p ${HOME}/outputs1/ && sudo mv ${HOME}/data/ds114_test1_freesurfer/00_group* ${HOME}/outputs1/ && cat ${HOME}/outputs1/00_group2_stats_tables/lh.aparc.thickness.tsv :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds114_test2:/bids_dataset -v ${HOME}/data/ds114_test2_freesurfer:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} /bids_dataset /outputs group2 --license_key="~/test.key" && mkdir -p ${HOME}/outputs2/ && sudo mv ${HOME}/data/ds114_test2_freesurfer/00_group* ${HOME}/outputs2/ && cat ${HOME}/outputs2/00_group2_stats_tables/lh.aparc.thickness.tsv :
timeout: 21600

deployment:
hub: