Commit

Merge pull request #2 from IGDRion/dev
Merge with dev to main for TFK updates, lncRNAs noORF, bambu memory
tderrien authored Mar 22, 2023
2 parents 9738240 + 89329e3 commit 88df06e
Showing 8 changed files with 17 additions and 11 deletions.
1 change: 1 addition & 0 deletions ANNEXA
Submodule ANNEXA added at 2e25dd
4 changes: 2 additions & 2 deletions README.md

@@ -2,7 +2,7 @@

 ## Introduction

-**ANNEXA** is an all-in-one reproducible pipeline, written in [Nextflow](https://nextflow.io), which allows users to analyze LR-RNAseq sequences from Oxford Nanopore Technologies (ONT), and to reconstruct and quantify known and novel genes and isoforms.
+**ANNEXA** is an all-in-one reproducible pipeline, written in [Nextflow](https://nextflow.io), which allows users to analyze LR-RNAseq data (Long-Read RNA-Seq), and to reconstruct and quantify known and novel genes and isoforms.

@@ -41,7 +41,7 @@ nextflow run IGDRion/ANNEXA \
 --fa /path/to/ref.fa
 ```

-The input parameter takes a file listing the bams to analyze (see example below)
+The input parameter takes a file listing the paths of the `bam` files to analyze (see example below)

 ```
 /path/to/1.bam
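The input file described above is a plain-text list of `bam` paths, one per line. A minimal Python sketch of how such a list could be parsed (illustrative only; ANNEXA itself handles this inside the Nextflow pipeline):

```python
from pathlib import Path

def read_bam_list(samplesheet: str) -> list[Path]:
    """Parse a one-path-per-line list of bam files, ignoring blank lines."""
    paths = []
    with open(samplesheet) as fh:
        for line in fh:
            line = line.strip()
            if line:
                paths.append(Path(line))
    return paths
```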
2 changes: 1 addition & 1 deletion bin/filter_gtf_ndr.py

@@ -9,7 +9,7 @@ def parse_bambu(line):

 def parse_tfkmers(line):
     ids = line[0].split("::")
-    return ids[0], ids[1], line[2]
+    return ids[0], ids[1], line[1]


 def parse_ndr(csv, origin, th) -> Set[str]:
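The one-column fix above changes which field of the TFKmers output is returned alongside the gene and transcript IDs. A minimal Python sketch of the corrected parser (the row layout is an assumption inferred from this diff, not confirmed by the source):

```python
def parse_tfkmers(line):
    # line is a pre-split CSV row; column 0 holds "gene_id::transcript_id",
    # and column 1 (not column 2, as before this fix) holds the value to keep.
    ids = line[0].split("::")
    return ids[0], ids[1], line[1]

# Hypothetical row illustrating the assumed field layout:
row = ["BambuGene1::BambuTx1", "0.15", "unused"]
print(parse_tfkmers(row))
```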
3 changes: 1 addition & 2 deletions bin/qc.R

@@ -230,8 +230,7 @@ gene_ext_dist = gene %>%
 # TRANSCRIPT
 #############################################################################
 transcript = read.csv(paste0(prefix, ".transcript.stats"), header = T)
-lncRNA_biotypes = c("retained_intron",
-                    "lncRNA",
+lncRNA_biotypes = c("lncRNA",
                     "antisense",
                     "non-coding",
                     "lnc_RNA")
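The qc.R change above stops counting `retained_intron` transcripts as lncRNAs. The resulting classification can be sketched in Python (biotype list copied from the updated diff; the helper name is illustrative):

```python
# Biotypes treated as lncRNA after this commit (from the updated qc.R):
LNCRNA_BIOTYPES = {"lncRNA", "antisense", "non-coding", "lnc_RNA"}

def is_lncRNA(biotype: str) -> bool:
    # "retained_intron" is no longer in the set, so it is excluded here.
    return biotype in LNCRNA_BIOTYPES
```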
3 changes: 2 additions & 1 deletion environment.yml

@@ -8,7 +8,8 @@ dependencies:

   - conda-forge::r-base=4.1
   - conda-forge::r-rcolorbrewer
-  - conda-forge::r-tidyverse
+  - conda-forge::r-tidyverse=1.3.2
+  - conda-forge::r-dplyr=1.0.10
   - conda-forge::r-reshape2
   - conda-forge::r-ggpubr
   - conda-forge::r-ggridges
5 changes: 5 additions & 0 deletions modules/feelnc/codpot.nf

@@ -30,6 +30,11 @@ process FEELNC_CODPOT {
     -l known_lncRNA.gtf \
     --numtx=3000,3000 \
     -o new
+    # consider new noORF transcripts as new lncRNA
+    if [ -e feelnc_codpot_out/new.noORF.gtf ]; then
+        cat feelnc_codpot_out/new.noORF.gtf >> feelnc_codpot_out/new.lncRNA.gtf
+    fi
     """
 }

6 changes: 3 additions & 3 deletions modules/index_bam.nf

@@ -1,8 +1,8 @@
 process INDEX_BAM {
-    conda (params.enable_conda ? "bioconda::samtools=1.15.1" : null)
+    conda (params.enable_conda ? "bioconda::samtools=1.16.1" : null)
     container "${ workflow.containerEngine == 'singularity' ?
-        'https://depot.galaxyproject.org/singularity/samtools:1.15.1--h1170115_0' :
-        'quay.io/biocontainers/samtools:1.15.1--h1170115_0' }"
+        'https://depot.galaxyproject.org/singularity/samtools%3A1.16.1--h6899075_0' :
+        'quay.io/biocontainers/samtools:1.16.1--h1170115_0' }"

     input:
     file bam
4 changes: 2 additions & 2 deletions nextflow.config

@@ -3,7 +3,7 @@ params {
     outdir = "results"
     withGeneCoverage = false
     maxCpu = 8
-    maxMemory = "40GB"
+    maxMemory = "80GB"
     enable_conda = false
     filter = false
     tfkmers_threshold = 0.2

@@ -14,7 +14,7 @@ params {
 }

 process {
-    memory = '8GB'
+    memory = '16GB'
 }

 profiles {
