Parallel preprocessing on LISA
On the LISA cluster, you can run the preprocessing in (at least) two ways:
- Slice-by-slice alignment in parallel, with the runs processed in series.
- Slice-by-slice alignment and the runs both in parallel.
For option 2 (recommended, because it is faster), you need to use the `bids_preproc_parallel_runs.py` script located in `NHP-BIDS/code/subcode`. You can do this as part of another job file (e.g., to execute multiple steps at once) or interactively, to avoid having to wait in the queue twice. The script itself runs quickly, but the jobs it creates go into the queue and are subject to SLURM scheduling rules. The script takes the following input arguments (an example invocation is shown after the list):
- `--csv`: The original csv-file defining multiple runs.
- `--subses`: A string marker used to create subfolders. Strongly recommended to use `<sub><sess>`, e.g. `Eddy20191212`.
- `--no-warp`: Include this flag to prevent running `bids_warp2nmt_workflow.py` as well. Omit it (recommended) and the warp will be done.
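A typical interactive invocation might look like the sketch below; the working directory and the csv filename are placeholders for your own setup, only the flag names come from the list above.

```bash
# Example invocation (paths and csv filename are placeholders for your own data).
cd ~/NHP-BIDS
python ./code/subcode/bids_preproc_parallel_runs.py \
    --csv ./csv/eddy_20191212.csv \
    --subses Eddy20191212
# Add --no-warp if you do NOT want bids_warp2nmt_workflow.py to be run as well.
```

The script returns quickly; the actual preprocessing happens in the jobs it submits.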
- The script will read the original csv-file and parse it.
- For each unique run in the csv-file, it creates a new csv-file that specifies only that run.
- A companion job-file is also created for each single-run csv-file and made executable (note that we reserve 10h for each job, which should be more than enough).
- The script then submits all jobs to SLURM so they can be executed in parallel. Depending on wait time on the cluster, this might get your entire preprocessing done in <6h.
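For reference, each generated job file is a small SLURM batch script. The exact contents depend on the LISA environment and the NHP-BIDS version; the sketch below only illustrates the idea, and the job name, csv path, and preprocessing script name are assumptions, not the script's actual output.

```bash
#!/bin/bash
#SBATCH --time=10:00:00            # the 10 h reservation mentioned above
#SBATCH --job-name=preproc_run01   # hypothetical name; one job per single-run csv-file

# Hypothetical sketch of a generated job file; module loads, partitions,
# and exact script calls depend on your LISA environment.
cd ~/NHP-BIDS

# Run the preprocessing on the single-run csv-file created for this job
# (workflow script name assumed here for illustration):
python ./code/bids_preprocessing_workflow.py --csv ./csv/Eddy20191212/run01.csv

# Unless --no-warp was given, the warp to NMT is also run:
python ./code/bids_warp2nmt_workflow.py --csv ./csv/Eddy20191212/run01.csv
```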