NBISweden/Containerized-Compute-Helm-Charts

First time setup guide

Create the global docker network

docker network create my-app-network
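
If you want to make sure the network exists before continuing (for example after a Docker restart), an optional check is:

docker network inspect my-app-network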

Setup LSAAI Mock

Generate a self-signed certificate for the LSAAI nginx proxy using this command:

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout ./lsaai-mock/configuration/nginx/certs/nginx.key -out ./lsaai-mock/configuration/nginx/certs/nginx.crt -addext subjectAltName=DNS:aai.localhost
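
To double-check that the generated certificate carries the expected subjectAltName, it can optionally be inspected with:

openssl x509 -in ./lsaai-mock/configuration/nginx/certs/nginx.crt -noout -text | grep -A1 "Subject Alternative Name"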

Setup REMS

  1. Generate the JWKS:
cd rems
python generate_jwks.py
cd ..
  2. Initialize the database:
./rems/migrate.sh
  3. Run the rems-app service:
docker compose up -d rems-app
  4. Log in to the REMS application (http://localhost:3000), so your user gets created in the database
  5. Initialize the application:
./rems/setup.sh

For steps 2, 3 and 5 it is important not to cd into the rems directory; run those commands from the repository root.
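
Put together, and assuming you run everything from the repository root, the whole sequence looks roughly like this (the browser login in step 4 still has to happen before the final script):

# Step 1: generate the JWKS; the subshell avoids staying inside rems/
(cd rems && python generate_jwks.py)
# Steps 2-3: initialize the database and start the rems-app service
./rems/migrate.sh
docker compose up -d rems-app
# Step 4: log in at http://localhost:3000 so your user is created in the database
# Step 5: initialize the application
./rems/setup.sh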

Run the rest of the services

docker compose up -d
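
Once the stack is up, the usual docker compose commands can be used to check that everything started (the service name below is just a placeholder):

docker compose ps
docker compose logs -f <service_name>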

Uploading a dataset to SDA

upload.sh is a simple script that uploads a dataset to SDA.

Prerequisites

  • sda-admin and sda-cli available in your $PATH
  • an access token from http://localhost:8085/
  • an s3cmd.conf file in the same directory as the script. You can use this example and just replace <access_token> with your token:
[default]
access_token = <access_token>
human_readable_sizes = True
host_bucket = http://localhost:8002
encrypt = False
guess_mime_type = True
multipart_chunk_size_mb = 50
use_https = False
access_key = jd123_lifescience-ri.eu
secret_key = jd123_lifescience-ri.eu
host_base = http://localhost:8002
check_ssl_certificate = False
encoding = UTF-8
check_ssl_hostname = False
socket_timeout = 30
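
If you would rather not paste the token into the file by hand, one possible approach (just a sketch; the s3cmd.conf.template file name and the TOKEN variable are not part of this repository) is to keep the example above as a template and substitute the placeholder:

# TOKEN holds the access token copied from http://localhost:8085/
TOKEN=<access_token>
# replace the <access_token> placeholder and write the real s3cmd.conf
sed "s|<access_token>|$TOKEN|" s3cmd.conf.template > s3cmd.conf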

Usage

  1. Move all your data to a folder next to the script
  2. Run ./upload.sh <folder_name> <dataset_name>
  3. After a successful upload, you should be able to fetch the data using:
token=<access_token>
curl -H "Authorization: Bearer $token" http://localhost:8443/s3/<dataset>/jd123_lifescience-ri.eu/<folder_name>/<file_name>
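
To quickly check that every file in the uploaded folder is reachable, a small loop over the local copies can be used (a sketch; <access_token>, <dataset> and <folder_name> are the same placeholders as above):

token=<access_token>
for f in <folder_name>/*; do
  name=$(basename "$f")
  curl -sf -o /dev/null -H "Authorization: Bearer $token" \
    "http://localhost:8443/s3/<dataset>/jd123_lifescience-ri.eu/<folder_name>/$name" \
    && echo "OK       $name" \
    || echo "MISSING  $name"
done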

Running a workflow from the compute-web UI

  1. Log in to http://localhost:4180
  2. Select a workflow on the home page
  3. Define the input directory as <dataset>/<user_id> (e.g. DATASET001/jd123_lifescience-ri.eu)
  4. Define the output directory. This is the directory in the s3 inbox bucket (e.g. myWorkflowOutput)
  5. Click Run

About

Fork for working on CC Helm charts
