Commit

[DOC] simplifying the install doc
[DOC] simplifying the install doc
bclenet committed Sep 20, 2023
1 parent bfda182 commit 4e9ecba
Showing 1 changed file with 44 additions and 43 deletions.
# How to install NARPS Open Pipelines?

## 1 - Fork the repository

[Fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo) the repository, so you have your own working copy of it.

## 2 - Clone the code

First, install [Datalad](https://www.datalad.org/). This will allow you to get the code as well as "links" to the data, because the NARPS data is bundled in this repository as [datalad subdatasets](http://handbook.datalad.org/en/latest/basics/101-106-nesting.html).

Then, [clone](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) the project:

```bash
datalad install --recursive https://github.com/YOUR_GITHUB_USERNAME/narps_open_pipelines.git
```

> [!WARNING]
> It is still possible to clone the fork using [git](https://git-scm.com/); but doing this, you will only get the code, losing the links to the data.

> ```bash
> git clone https://github.com/YOUR_GITHUB_USERNAME/narps_open_pipelines.git
> ```
## 3 - Get the data

Now that you cloned the repository using Datalad, you are able to get the data:
```bash
# Move inside the root directory of the repository.
cd narps_open_pipelines
# Select the data you want to download. Here is an example to get data of the first 4 subjects.
datalad get data/original/ds001734/sub-00[1-4] -J 12
datalad get data/original/ds001734/derivatives/fmriprep/sub-00[1-4] -J 12
```
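
A side note on the `sub-00[1-4]` argument: it is a standard shell bracket pattern, expanded by your shell when matching paths exist. A minimal sketch of how it behaves, using dummy directories under a hypothetical `/tmp/glob_demo` path (not part of the repository):

```shell
# Create dummy directories (hypothetical names) to show how the bracket
# pattern used above is expanded before datalad sees the arguments.
mkdir -p /tmp/glob_demo/sub-001 /tmp/glob_demo/sub-002 \
         /tmp/glob_demo/sub-003 /tmp/glob_demo/sub-004 /tmp/glob_demo/sub-005
cd /tmp/glob_demo
# sub-00[1-4] matches sub-001 through sub-004, but not sub-005
echo sub-00[1-4]
```

The same expansion is what restricts the `datalad get` calls above to the first four subjects of the dataset.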

> [!NOTE]
> For further information and alternatives on how to get the data, see the corresponding documentation page [docs/data.md](docs/data.md).

## 4 - Set up the environment

The Narps Open Pipelines project is built upon several dependencies, such as [Nipype](https://nipype.readthedocs.io/en/latest/) but also the original software packages used by the pipelines (SPM, FSL, AFNI...).

Therefore, we created a Docker container based on [Neurodocker](https://github.com/ReproNim/neurodocker) that contains the necessary Python packages and software. [Install Docker](https://docs.docker.com/engine/install/) then pull the Docker image:

```bash
docker pull elodiegermani/open_pipeline:latest
```

Once it's done, you can check that the image is available on your system:

```bash
docker images
docker.io/elodiegermani/open_pipeline latest 0f3c74d28406 9 months ago 22.7 GB
```

> [!NOTE]
> Feel free to read the documentation page [docs/environment.md](docs/environment.md) to learn more about this environment and how to start it.
## 5 - Run the project

Start a Docker container from the Docker image:

```bash
docker run -it -v <path_to_the_repository>:/home/neuro/code/ elodiegermani/open_pipeline
```

Install NARPS Open Pipeline inside the container:

```bash
source activate neuro
cd /home/neuro/code/
pip install .
```

Finally, you are able to run pipelines:

```bash
python narps_open/runner.py
usage: runner.py [-h] -t TEAM (-r RSUBJECTS | -s SUBJECTS [SUBJECTS ...] | -n NSUBJECTS) [-g | -f] [-c]
```
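
The usage string shows that `-t` is mandatory while `-r`, `-s` and `-n` are mutually exclusive ways of selecting subjects. A minimal sketch of that interface with Python's `argparse` (option names copied from the usage line above; the comments and the example team/subject IDs are assumptions for illustration, not taken from the project):

```python
import argparse

# Hypothetical re-creation of the runner.py interface shown above,
# limited to the TEAM option and the subject-selection group.
parser = argparse.ArgumentParser(prog="runner.py")
parser.add_argument("-t", dest="team", required=True)
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("-r", dest="rsubjects")            # random subset of subjects
group.add_argument("-s", dest="subjects", nargs="+")  # explicit subject list
group.add_argument("-n", dest="nsubjects")            # number of subjects

# Example: hypothetical team ID and subject list
args = parser.parse_args(["-t", "2T6S", "-s", "001", "002", "003", "004"])
print(args.team, args.subjects)  # 2T6S ['001', '002', '003', '004']
```

Passing two selection options at once (e.g. both `-s` and `-n`) would be rejected, which is what the `(... | ... | ...)` grouping in the usage string expresses.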

> [!NOTE]
> For further information, read [docs/running.md](docs/running.md).
