[REF] Provide context on uploading data
surchs committed May 14, 2024
1 parent e64a912 commit 6697c59
Showing 2 changed files with 33 additions and 7 deletions.
12 changes: 6 additions & 6 deletions docs/config.md
@@ -284,14 +284,14 @@ These are manual steps for configuring the GraphDB backend after launching the N
you can connect to the Workbench at [http://localhost:7200](http://localhost:7200).
The Workbench is well documented on the [GraphDB website](https://graphdb.ontotext.com/documentation/10.0/workbench-user-interface.html).



### Uploading data to the graph store

Data are automatically uploaded to the graph from the path specified with
the `LOCAL_GRAPH_DATA` in the `.env` configuration file when the Neurobagel stack is (re-)started.
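For reference, the relevant `.env` entry might look like the fragment below (the directory path is purely illustrative; point it at wherever your graph-ready files live):

``` bash
# .env (fragment) -- the path shown is an example, not a required value
LOCAL_GRAPH_DATA=./data
```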

If you instead prefer to upload data manually,
you can use the
[`add_data_to_graph.sh`](https://github.com/neurobagel/recipes/blob/main/scripts/add_data_to_graph.sh) script:

``` bash
./add_data_to_graph.sh PATH/TO/YOUR/GRAPH-DATA \
```
28 changes: 27 additions & 1 deletion docs/maintaining.md
@@ -71,9 +71,35 @@ docker compose --profile full_stack up -d

## Updating the data in your graph

If you have followed the [initial setup](getting_started.md)
and have deployed your Neurobagel node from our Docker Compose recipe,
your node will have a dedicated graph database that only contains
the data for your node.

This makes updating the data in your graph straightforward.
Once you have generated the updated files you want to upload,
the steps are:

1. Shut down the Neurobagel node

```bash
docker compose --profile full_stack down
```

2. Update the data files in [your `LOCAL_GRAPH_DATA` directory](config.md#uploading-data-to-the-graph-store)
3. Restart the Neurobagel node

```bash
docker compose --profile full_stack up -d
```
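The three steps above can be sketched as a small helper script. This is a sketch under assumptions: the `run` dry-run wrapper, the function name, and both directory arguments are illustrative additions, not part of the official Neurobagel recipe, and it assumes you run it from the directory containing the Docker Compose files:

``` bash
#!/usr/bin/env bash
# Sketch: automate the three graph-data update steps.
set -eu

# Illustrative helper: print commands instead of running them when DRY_RUN=1.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "$@"; else "$@"; fi
}

update_graph_data() {
  local new_data_dir="$1" local_graph_data="$2"

  # 1. Shut down the Neurobagel node
  run docker compose --profile full_stack down

  # 2. Replace the data files in your LOCAL_GRAPH_DATA directory
  run cp "$new_data_dir"/*.jsonld "$local_graph_data"/

  # 3. Restart the Neurobagel node
  run docker compose --profile full_stack up -d
}
```

Setting `DRY_RUN=1` lets you preview exactly which commands would run before touching a live node.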

Here are some common scenarios where you might need to update the data in your graph:

### Following a change in my _dataset_

When using Neurobagel tools on a dataset that is still undergoing data collection,
you may need to update the Neurobagel annotations and/or graph-ready data for the dataset
when you want to add new subjects or measurements or to correct mistakes in prior data versions.

For any of the below types of changes, you will need to regenerate the dataset's graph-ready `.jsonld` file so that it reflects the change.

