Fix minor typos (#43)
mribeirodantas authored Dec 8, 2024
1 parent bcc1e37 commit b6c1ae0
Showing 5 changed files with 11 additions and 11 deletions.
6 changes: 3 additions & 3 deletions docs/params.md
@@ -93,9 +93,9 @@ Reference genome related files and options required for the workflow.

| Parameter | Description | Type | Default | Required | Hidden |
|-----------|-----------|-----------|-----------|-----------|-----------|
-| `download_functional` | Whether to dowload functional references | `boolean` | True | | |
-| `download_kaiju` | Whether to dowload the Kaiju reference db | `boolean` | True | | |
-| `download_kraken` | Whether to dowload the Kraken2 reference db | `boolean` | | | |
+| `download_functional` | Whether to download functional references | `boolean` | True | | |
+| `download_kaiju` | Whether to download the Kaiju reference db | `boolean` | True | | |
+| `download_kraken` | Whether to download the Kraken2 reference db | `boolean` | | | |
| `download_host` | Whether to download the host reference genome | `boolean` | | | |
| `functional_db` | Functional reference URL (download entry) | `string` | https://ftp.ncbi.nlm.nih.gov/blast/db/FASTA/nr.gz | | |
| `functional_dictionary` | Functional dictionary URL (download entry) | `string` | https://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/idmapping/idmapping.dat.gz | | |
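The download parameters documented in this hunk can be overridden without editing the pipeline source. A minimal sketch of a custom configuration file (supplied at run time with `-c my_params.config`); the values shown are illustrative, and the parameter names come from the table above:

```groovy
// Illustrative overrides for the download-related parameters above.
// Pass with: nextflow run <pipeline> -c my_params.config
params {
    download_functional = false   // skip the functional reference download
    download_kraken     = true    // fetch the Kraken2 reference db instead
    functional_db       = 'https://ftp.ncbi.nlm.nih.gov/blast/db/FASTA/nr.gz'
}
```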
4 changes: 2 additions & 2 deletions docs/usage.md
@@ -94,7 +94,7 @@ Several generic profiles are bundled with the pipeline which instruct the pipeli
Note that multiple profiles can be loaded, for example: `-profile test,docker` - the order of arguments is important!
They are loaded in sequence, so later profiles can overwrite earlier profiles.

-If `-profile` is not specified, the pipeline will run locally and expect all software to be installed and available on the `PATH`. This is _not_ recommended, since it can lead to different results on different machines dependent on the computer enviroment.
+If `-profile` is not specified, the pipeline will run locally and expect all software to be installed and available on the `PATH`. This is _not_ recommended, since it can lead to different results on different machines dependent on the computer environment.

- `test`
- A profile with a complete configuration for automated testing
@@ -164,7 +164,7 @@ Tip: you can replicate the issue by changing to the process work dir and enterin

#### For beginners

-A first step to bypass this error, you could try to increase the amount of CPUs, memory, and time for the whole pipeline. Therefor you can try to increase the resource for the parameters `--max_cpus`, `--max_memory`, and `--max_time`. Based on the error above, you have to increase the amount of memory. Therefore you can go to the [parameter documentation of rnaseq](https://nf-co.re/rnaseq/3.9/parameters) and scroll down to the `show hidden parameter` button to get the default value for `--max_memory`. In this case 128GB, you than can try to run your pipeline again with `--max_memory 200GB -resume` to skip all process, that were already calculated. If you can not increase the resource of the complete pipeline, you can try to adapt the resource for a single process as mentioned below.
+A first step to bypass this error, you could try to increase the amount of CPUs, memory, and time for the whole pipeline. Therefore you can try to increase the resource for the parameters `--max_cpus`, `--max_memory`, and `--max_time`. Based on the error above, you have to increase the amount of memory. Therefore you can go to the [parameter documentation of rnaseq](https://nf-co.re/rnaseq/3.9/parameters) and scroll down to the `show hidden parameter` button to get the default value for `--max_memory`. In this case 128GB, you than can try to run your pipeline again with `--max_memory 200GB -resume` to skip all process, that were already calculated. If you can not increase the resource of the complete pipeline, you can try to adapt the resource for a single process as mentioned below.

#### Advanced option on process level

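The per-process tuning referred to in this hunk ("Advanced option on process level") can be sketched as a small configuration fragment, passed with `-c custom.config`. The process selector name below is a placeholder, not a process from this repository:

```groovy
// Sketch: raise resources for one failing process instead of the whole
// pipeline. 'STAR_ALIGN' is an illustrative process name, not from this repo.
process {
    withName: 'STAR_ALIGN' {
        cpus   = 16
        memory = 200.GB
        time   = 24.h
    }
}
```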
2 changes: 1 addition & 1 deletion lib/WorkflowEuryale.groovy
@@ -39,7 +39,7 @@ class WorkflowEuryale {
}

public static String methodsDescriptionText(run_workflow, mqc_methods_yaml) {
-// Convert to a named map so can be used as with familar NXF ${workflow} variable syntax in the MultiQC YML file
+// Convert to a named map so that it can be used with a familiar NXF ${workflow} variable syntax in the MultiQC YML file
def meta = [:]
meta.workflow = run_workflow.toMap()
meta["manifest_map"] = run_workflow.manifest.toMap()
2 changes: 1 addition & 1 deletion nextflow.config
@@ -49,7 +49,7 @@ params {
skip_alignment = false
skip_microview = false

-// Dowload entry options
+// Download entry options
download_functional = true
download_kaiju = true
download_kraken = false
8 changes: 4 additions & 4 deletions nextflow_schema.json
@@ -242,16 +242,16 @@
"download_functional": {
"type": "boolean",
"default": true,
"description": "Whether to dowload functional references"
"description": "Whether to download functional references"
},
"download_kaiju": {
"type": "boolean",
"default": true,
"description": "Whether to dowload the Kaiju reference db"
"description": "Whether to download the Kaiju reference db"
},
"download_kraken": {
"type": "boolean",
"description": "Whether to dowload the Kraken2 reference db"
"description": "Whether to download the Kraken2 reference db"
},
"download_host": {
"type": "boolean",
@@ -471,4 +471,4 @@
"$ref": "#/definitions/generic_options"
}
]
-}
+}