diff --git a/docs/infrastructure.md b/docs/infrastructure.md
index b8876bac..3b24f930 100644
--- a/docs/infrastructure.md
+++ b/docs/infrastructure.md
@@ -84,15 +84,24 @@ We have tested the following options:
 We recommend launching the Neurobagel API and your graph backend instance using `docker compose`. (To install the API from source, see [these instructions](https://github.com/neurobagel/api/blob/main/README.md#local-installation).)
 
-### Clone the API repo
+### Clone the configuration file templates
+
+The [`neurobagel/recipes`](https://github.com/neurobagel/recipes) repository contains templates of all files needed for configuring different types of Neurobagel deployments.
+
+Configuration files for setting up a single Neurobagel node are found in the [`local_node`](https://github.com/neurobagel/recipes/tree/main/local_node) subdirectory.
+**You can follow the steps below directly in this subdirectory, or in a new directory outside of the repository.**
 
 ```bash
-git clone https://github.com/neurobagel/api.git
+git clone https://github.com/neurobagel/recipes.git
+cd recipes/local_node
 ```
 
 ### Set the environment variables
 
-Create a `.env` file in the root of the repository to house the environment variables used by the Neurobagel API-graph network.
+Create a `.env` file to house the environment variables used by the Neurobagel API-graph network.
 
-The `neurobagel/api` repo contains a [`.template-env`](https://github.com/neurobagel/api/blob/main/.template-env) to get you started. Copy and rename this file to `.env` and then edit it as needed.
+The `neurobagel/recipes` repo contains a
+[`template.env`](https://github.com/neurobagel/recipes/blob/main/local_node/template.env)
+in [the `local_node` recipe](https://github.com/neurobagel/recipes/tree/main/local_node)
+to get you started. Copy and rename this file to `.env` and then edit it as needed.
 
 Below are all the possible Neurobagel environment variables that can be set in `.env`.
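Editor's note: the copy-and-edit step described in the hunk above can be sketched as follows. This is illustrative only; the `NB_GRAPH_USERNAME` / `NB_GRAPH_PASSWORD` variable names are assumptions, and `template.env` in the recipe remains the authoritative list of variables.

```shell
# Sketch only: start from the template rather than writing .env by hand
# (assumes the current directory is recipes/local_node):
#   cp template.env .env
# After editing, a minimal .env might contain entries along these lines
# (NB_GRAPH_USERNAME / NB_GRAPH_PASSWORD are assumed names; check template.env):
NB_GRAPH_USERNAME=DBUSER
NB_GRAPH_PASSWORD=DBPASSWORD
NB_API_ALLOWED_ORIGINS="http://localhost:3000 http://127.0.0.1:3000"
```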
@@ -151,7 +160,7 @@ For a local deployment, we recommend to **explicitly set** at least the followin
 To spin up the Neurobagel API and graph backend containers using Docker Compose,
 ensure that both [docker](https://docs.docker.com/get-docker/) and [docker compose](https://docs.docker.com/compose/install/) are installed.
 
-Run the following in the repository root (where the `docker-compose.yml` file is) to launch the containers:
+Run the following in the directory containing both the `docker-compose.yml` file from [the `local_node` recipe](https://github.com/neurobagel/recipes/tree/main/local_node) and the `.env` file you just created to launch the containers:
 
 !!! tip
     Double check that any environment variables you have customized in `.env` are resolved with your expected values using the command `docker compose config`.
@@ -418,7 +427,8 @@ with a name of `test_data`.
 
 ## Uploading data to the graph
 
-The `neurobagel/api` repo contains a helper script [`add_data_to_graph.sh`](https://github.com/neurobagel/api/blob/main/add_data_to_graph.sh) for automatically uploading all JSONLD and/or TTL files (i.e., graph-ready data) in a directory to a specific graph database, with the option to clear the existing data in the database first.
+The `neurobagel/recipes` repo contains a helper script [`add_data_to_graph.sh`](https://github.com/neurobagel/recipes/blob/main/scripts/add_data_to_graph.sh) in the `scripts` subdirectory for automatically uploading all JSONLD and/or TTL files (i.e., graph-ready data) in a directory to a specific graph database,
+with the option to clear the existing data in the database first.
 
 In the context of Neurobagel, each `.jsonld` file is expected to correspond to a single **dataset**.
 To view all the command line arguments for add_data_to_graph.sh:
@@ -522,24 +532,24 @@ You can choose to omit the flag or explicitly specify `--no-clear-data` (default
 The participant variables modeled by Neurobagel are named using Neurobagel's own vocabulary
 (for more information, see this page on [controlled terms](./term_naming_standards.md)).
 This vocabulary, which defines internal relationships between vocabulary terms,
-is serialized in the file [`nb_vocab.ttl`](https://github.com/neurobagel/api/blob/main/vocab/nb_vocab.ttl) available from the `neurobagel/api` repository.
-If you have cloned the api repository, you will already have downloaded the vocabulary file.
+is serialized in the file [`nb_vocab.ttl`](https://github.com/neurobagel/recipes/blob/main/vocab/nb_vocab.ttl) available from the `neurobagel/recipes` repository.
+If you have cloned this repository, you will already have downloaded the vocabulary file.
 
 **The `nb_vocab.ttl` file should be added to every created Neurobagel graph database.**
-This can be done using the same script we used to upload the dataset JSONLD files, [`add_data_to_graph.sh`](https://github.com/neurobagel/api/blob/main/add_data_to_graph.sh), which adds all `.ttl` and/or `.jsonld` files in a given directory to the specified graph.
+This can be done using the same script we used to upload the dataset JSONLD files, [`add_data_to_graph.sh`](https://github.com/neurobagel/recipes/blob/main/scripts/add_data_to_graph.sh), which adds all `.ttl` and/or `.jsonld` files in a given directory to the specified graph.
 
-Run the following code (assumes you are in the `api` directory):
+Run the following code (assumes you are in the `scripts` subdirectory inside the `recipes` repository):
 
 === "GraphDB"
 
     ``` bash
-    ./add_data_to_graph.sh vocab \
+    ./add_data_to_graph.sh ../vocab \
     localhost:7200 repositories/my_db DBUSER DBPASSWORD \
     --use-graphdb-syntax
     ```
 
 === "Stardog"
 
     ``` bash
-    ./add_data_to_graph.sh vocab \
+    ./add_data_to_graph.sh ../vocab \
     localhost:5820 test_data DBUSER DBPASSWORD
     ```
 
@@ -639,9 +649,13 @@ The `NB_API_ALLOWED_ORIGINS` variable defaults to an empty string (`""`) when un
 To make the Neurobagel API accessible by a frontend tool such as our [browser query tool](https://github.com/neurobagel/query-tool),
 you must explicitly specify the origin(s) for the frontend using `NB_API_ALLOWED_ORIGINS` in `.env`.
-For detailed instructions regarding the query tool see [Running cohort queries](query_tool.md).
+(Detailed instructions for using the query tool can be found in [Running cohort queries](query_tool.md).)
+
+For example, add the following line to your `.env` file to allow API requests from a query tool hosted at a specific port on `localhost` (see the [Docker Compose section](#docker-compose)).
 
-For example, the [`.template-env`](https://github.com/neurobagel/api/blob/main/.template-env) file in the Neurobagel API repo assumes you want to allow API requests from a query tool hosted at a specific port on `localhost` (see the [Docker Compose section](#docker-compose)).
+```bash
+NB_API_ALLOWED_ORIGINS="http://localhost:3000 http://127.0.0.1:3000"
+```
 
 !!! example "More examples of `NB_API_ALLOWED_ORIGINS`"
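Editor's note: since `NB_API_ALLOWED_ORIGINS` in the final hunk takes a whitespace-separated list of origins, here is a small POSIX-shell sketch of how such a value splits into individual origins. This is purely illustrative of the value's format; the API does its own parsing.

```shell
# Whitespace-separated origins, as in the example above
NB_API_ALLOWED_ORIGINS="http://localhost:3000 http://127.0.0.1:3000"

# Unquoted expansion splits on whitespace, giving one origin per iteration
for origin in $NB_API_ALLOWED_ORIGINS; do
  echo "allowed origin: $origin"
done
```

Echoing the split values like this is an easy way to confirm that each origin, including scheme and port, is listed exactly as the browser will send it in the `Origin` request header.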