
Commit

Merge pull request #381 from BU-ISCIII/develop
Develop merge for 2.2.3 release
victor5lm authored Dec 23, 2024
2 parents 86556c4 + fd589bb commit 6d9f0a2
Showing 115 changed files with 373 additions and 327 deletions.
36 changes: 35 additions & 1 deletion CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -4,18 +4,52 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [2.X.Xdev] - 2024-0X-XX : https://github.com/BU-ISCIII/buisciii-tools/releases/tag/2.X.X
## [2.X.Xhot] - 2024-0X-0X : https://github.com/BU-ISCIII/buisciii-tools/releases/tag/2.X.3

### Credits

Code contributions to the hotfix:

### Template fixes and updates

### Modules

#### Added enhancements

#### Fixes

#### Changed

#### Removed

### Requirements

## [2.2.3] - 2024-12-23 : https://github.com/BU-ISCIII/buisciii-tools/releases/tag/2.2.3

### Credits

Code contributions to the new version:

- [Victor Lopez](https://github.com/victor5lm)
- [Sarai Varona](https://github.com/svarona)

### Template fixes and updates

- Updated sftp_user.json, added the locus-tag option for the PROKKA process in the bacass config file, and changed new_service.py so that integrity is checked only for the samples of interest [#363](https://github.com/BU-ISCIII/buisciii-tools/pull/363).
- Replaced /data/bi/ with /data/ucct/bi/ [#380](https://github.com/BU-ISCIII/buisciii-tools/pull/380).
- Updated bacass version in all pertinent files [#380](https://github.com/BU-ISCIII/buisciii-tools/pull/380).
- Updated read length variable definition when creating the mapping_illumina.tab file [#380](https://github.com/BU-ISCIII/buisciii-tools/pull/380).
- Updated create_irma_stats.sh to include %mapped_reads [#380](https://github.com/BU-ISCIII/buisciii-tools/pull/380).
- Changed "Buenas" to "Estimado/a" in email.j2 [#380](https://github.com/BU-ISCIII/buisciii-tools/pull/380).

### Modules

#### Added enhancements

#### Fixes

- Fixed new-service to correctly handle services with no samples [#372](https://github.com/BU-ISCIII/buisciii-tools/pull/372). Fixes issue [#371](https://github.com/BU-ISCIII/buisciii-tools/issues/371).
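
The no-samples fix can be sketched as a small guard (hypothetical, simplified names; the real `new_service` module also drives MD5 checks, folder creation, template copying, and user prompts):

```python
def plan_new_service(resolution_info: dict) -> str:
    """Decide how to proceed for a service, tolerating missing samples.

    Hypothetical simplification of the PR #372 fix: a resolution whose
    "samples" entry is absent, None, or empty no longer crashes the run;
    it falls through to a warn-and-confirm path instead.
    """
    samples = resolution_info.get("samples") or []
    if samples:
        return "full_setup"      # MD5 check, folders, templates, samples_id
    return "warn_and_prompt"     # warn, then ask the operator whether to proceed


print(plan_new_service({"samples": ["SAMPLE_01"]}))  # full_setup
print(plan_new_service({}))                          # warn_and_prompt
```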

#### Changed

#### Removed
10 changes: 5 additions & 5 deletions README.md
@@ -202,13 +202,13 @@ Options:
-a, --ask_path Please ask for service path.
-t, --tmp_dir PATH Directory to which the files will be
transferred for execution. Default:
/data/bi/scratch_tmp/bi/
/data/ucct/bi/scratch_tmp/bi/
-d, --direction [service_to_scratch|scratch_to_service|remove_scratch]
Direction of the rsync command.
service_to_scratch from /data/bi/service to
/data/bi/scratch_tmp/bi/.scratch_to_service:
From /data/bi/scratch_tmp/bi/ to
/data/bi/service
service_to_scratch from /data/ucct/bi/service to
/data/ucct/bi/scratch_tmp/bi/.scratch_to_service:
From /data/ucct/bi/scratch_tmp/bi/ to
/data/ucct/bi/service
--help Show this message and exit.
```
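
A minimal sketch of how the `--direction` choices map to source and destination roots, inferred from the help text above (the tool's real implementation builds its rsync call elsewhere; `remove_scratch`, which only deletes the scratch copy, is not modeled here):

```python
def rsync_endpoints(direction: str) -> tuple:
    """Map a --direction choice to (source, destination) roots.

    Inferred from the CLI help text, not copied from the implementation;
    the real tool resolves service-specific subpaths under these roots.
    """
    mapping = {
        "service_to_scratch": ("/data/ucct/bi/service", "/data/ucct/bi/scratch_tmp/bi/"),
        "scratch_to_service": ("/data/ucct/bi/scratch_tmp/bi/", "/data/ucct/bi/service"),
    }
    if direction not in mapping:
        raise ValueError(f"unknown direction: {direction}")
    return mapping[direction]


print(rsync_endpoints("service_to_scratch"))
```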

12 changes: 6 additions & 6 deletions bu_isciii/__main__.py
@@ -57,7 +57,7 @@ def run_bu_isciii():
)

# stderr.print("[green] `._,._,'\n", highlight=False)
__version__ = "2.2.2"
__version__ = "2.2.3"
stderr.print(
"[grey39] BU-ISCIII-tools version {}".format(__version__), highlight=False
)
@@ -252,8 +252,8 @@ def new_service(ctx, resolution, path, no_create_folder, ask_path):
"-t",
"--tmp_dir",
type=click.Path(),
default="/data/bi/scratch_tmp/bi/",
help="Directory to which the files will be transferred for execution. Default: /data/bi/scratch_tmp/bi/",
default="/data/ucct/bi/scratch_tmp/bi/",
help="Directory to which the files will be transferred for execution. Default: /data/ucct/bi/scratch_tmp/bi/",
)
@click.option(
"-d",
@@ -262,8 +262,8 @@ def new_service(ctx, resolution, path, no_create_folder, ask_path):
multiple=False,
help=(
"Direction of the rsync command. service_to_scratch "
"from /data/bi/service to /data/bi/scratch_tmp/bi/."
"scratch_to_service: From /data/bi/scratch_tmp/bi/ to /data/bi/service"
"from /data/ucct/bi/service to /data/ucct/bi/scratch_tmp/bi/."
"scratch_to_service: From /data/ucct/bi/scratch_tmp/bi/ to /data/ucct/bi/service"
),
)
@click.pass_context
@@ -411,7 +411,7 @@ def copy_sftp(ctx, resolution, path, ask_path, sftp_folder):
"-t",
"--tmp_dir",
type=click.Path(),
default="/data/bi/scratch_tmp/bi/",
default="/data/ucct/bi/scratch_tmp/bi/",
help="Absolute path to the scratch directory containing the service.",
)
@click.pass_context
2 changes: 1 addition & 1 deletion bu_isciii/bioinfo_doc.py
@@ -222,7 +222,7 @@ def __init__(
def load_versions(self):
"""Load and parse the versions.yml file."""
result = subprocess.run(
f"find /data/bi/services_and_colaborations/*/*/{self.service_name} -name '*versions.yml'",
f"find /data/ucct/bi/services_and_colaborations/*/*/{self.service_name} -name '*versions.yml'",
stdout=subprocess.PIPE,
text=True,
shell=True,
2 changes: 1 addition & 1 deletion bu_isciii/conf/configuration.json
@@ -1,6 +1,6 @@
{
"global": {
"data_path": "/data/bi",
"data_path": "/data/ucct/bi",
"archived_path": "/archived/bi",
"yaml_conf_path": "~/buisciii_config.yml",
"permissions": {
2 changes: 1 addition & 1 deletion bu_isciii/conf/configuration_dev.json
@@ -1,6 +1,6 @@
{
"global": {
"data_path": "tests/data/bi",
"data_path": "tests/data/ucct/bi",
"archived_path": "tests/archived/bi",
"yaml_conf_path": "~/buisciii_config.yml"
},
23 changes: 16 additions & 7 deletions bu_isciii/new_service.py
@@ -63,7 +63,7 @@ def __init__(
self.services_requested = self.resolution_info["resolutions"][0][
"available_services"
]
self.service_samples = self.resolution_info.get("samples", None)
self.service_samples = self.resolution_info.get("samples")

if ask_path and path is None:
stderr.print("Directory where you want to create the service folder.")
@@ -103,12 +103,19 @@ def check_md5(self):
md5_dir = os.path.dirname(md5_file_path)
os.chdir(md5_dir)

# Regex pattern to match sample names in .fastq.gz files
sample_names_pattern = "|".join(
[
f"{sample['sample_name']}.*\\.fastq\\.gz"
for sample in self.service_samples
]
)

# md5sum command
stderr.print(f"[blue]Checking MD5 integrity for {md5_file_path}")
try:
subprocess.run(
["md5sum", "-c", os.path.basename(md5_file_path)], check=True
)
cmd = f"grep -E '{sample_names_pattern}' {md5_file_path} | md5sum -c"
subprocess.run(cmd, shell=True, check=True, executable="/bin/bash")
stderr.print("[green]MD5 check passed!")
except subprocess.CalledProcessError as e:
stderr.print(f"[red]ERROR: MD5 check failed: {e.stderr}")
@@ -117,7 +124,6 @@ def check_md5(self):
os.chdir(original_dir)

def create_folder(self):
self.check_md5()
if not self.no_create_folder:
stderr.print(
"[blue]I will create the service folder for " + self.resolution_id + "!"
@@ -251,7 +257,8 @@ def samples_json(self):
f.close()

def create_new_service(self):
if self.service_samples is not None:
if len(self.service_samples) > 0:
self.check_md5()
self.create_folder()
self.copy_template()
self.create_samples_id()
@@ -270,7 +277,9 @@ def create_new_service(self):
stderr.print(
"[yellow]WARN: No samples recorded in service: " + self.resolution_id
)
if bu_isciii.utils.prompt_yn_question("Do you want to proceed?: "):
if bu_isciii.utils.prompt_yn_question(
"Do you want to proceed?: ", dflt=True
):
self.create_folder()
self.copy_template()
if self.resolution_info["service_state"] != "in_progress":
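
The reworked check_md5 verifies only the manifest entries that belong to the requested samples. A pure-Python sketch of the same filtering idea, assuming the md5sum manifest's two-space `hash  filename` format (the helper name is hypothetical; the tool itself pipes `grep -E` into `md5sum -c`):

```python
import hashlib
import re


def verify_md5_for_samples(md5_file, sample_names):
    r"""Return files failing their MD5 check, restricted to given samples.

    Sketch of the grep -E '<sample>.*\.fastq\.gz' | md5sum -c pipeline
    introduced in PR #363: only manifest lines whose file name matches
    one of the requested sample names are verified.
    """
    if not sample_names:
        return []  # nothing requested, nothing to verify
    pattern = re.compile(
        "|".join(rf"{re.escape(name)}.*\.fastq\.gz" for name in sample_names)
    )
    failures = []
    with open(md5_file) as manifest:
        for line in manifest:
            expected, _, filename = line.strip().partition("  ")
            if not filename or not pattern.search(filename):
                continue  # not a sample of interest
            with open(filename, "rb") as fh:
                digest = hashlib.md5(fh.read()).hexdigest()
            if digest != expected:
                failures.append(filename)
    return failures
```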
4 changes: 2 additions & 2 deletions bu_isciii/schemas/deliver_automatization.json
@@ -4,7 +4,7 @@
"type": "object",
"properties": {
"destination": {
"example": "/data/bi/sftp/Labvirusres",
"example": "/data/ucct/bi/sftp/Labvirusres",
"description": "Folder at which the command will be executed",
"type": "string"
},
@@ -24,7 +24,7 @@
"type": "string"
},
"source": {
"example": "/data/bi/services_and_colaborations/CNM/virologia/",
"example": "/data/ucct/bi/services_and_colaborations/CNM/virologia/",
"description": "Folder from which the files are coming",
"type": "string"
},
@@ -4,4 +4,4 @@ mkdir logs

scratch_dir=$(echo $PWD | sed "s/\/data\/bi\/scratch_tmp/\/scratch/g")

cat ../samples_id.txt | while read in; do echo "mkdir $in; srun --partition short_idx --cpus-per-task 8 --time 01:00:00 --chdir $scratch_dir --output logs/FASTQC.${in}.%j.log singularity exec -B ${scratch_dir}/../../../ -B /srv/fastq_repo/ /data/bi/pipelines/singularity-images/fastqc:0.11.9--hdfd78af_1 fastqc -o ${scratch_dir}/$in --nogroup -t 8 -k 8 ${scratch_dir}/../00-reads/"$in"_R1.fastq.gz ${scratch_dir}/../00-reads/"$in"_R2.fastq.gz &"; done > _01_rawfastqc.sh
cat ../samples_id.txt | while read in; do echo "mkdir $in; srun --partition short_idx --cpus-per-task 8 --time 01:00:00 --chdir $scratch_dir --output logs/FASTQC.${in}.%j.log singularity exec -B ${scratch_dir}/../../../ -B /srv/fastq_repo/ /data/ucct/bi/pipelines/singularity-images/fastqc:0.11.9--hdfd78af_1 fastqc -o ${scratch_dir}/$in --nogroup -t 8 -k 8 ${scratch_dir}/../00-reads/"$in"_R1.fastq.gz ${scratch_dir}/../00-reads/"$in"_R2.fastq.gz &"; done > _01_rawfastqc.sh
@@ -1,4 +1,4 @@
# module load singularity
mkdir logs
scratch_dir=$(echo $(pwd) | sed 's@/data/bi/scratch_tmp/@/scratch/@g')
cat ../samples_id.txt | xargs -I @@ echo "mkdir @@; srun --chdir ${scratch_dir} --mem 10G --time 1:00:00 --job-name FP.@@ --output logs/FP.@@.%j.log --partition short_idx --cpus-per-task 5 singularity exec -B ${scratch_dir}/../../../ -B /srv/fastq_repo/ /data/bi/pipelines/singularity-images/fastp:0.20.0--hdbcaa40_0 fastp --in1 ${scratch_dir}/../00-reads/@@_R1.fastq.gz --in2 ${scratch_dir}/../00-reads/@@_R2.fastq.gz --thread 5 --cut_front --cut_tail --cut_mean_quality 15 --qualified_quality_phred 15 --trim_poly_x --length_required 50 --detect_adapter_for_pe --json ${scratch_dir}/@@/@@_fastp.json --html ${scratch_dir}/@@/@@_fastp.html --out1 ${scratch_dir}/@@/@@_R1_filtered.fastq.gz --out2 ${scratch_dir}/@@/@@_R2_filtered.fastq.gz --unpaired1 ${scratch_dir}/@@/@@_R1_unpaired.fastq.gz --unpaired2 ${scratch_dir}/@@/@@_R2_unpaired.fastq.gz &" > _01_fastp.sh
scratch_dir=$(echo $(pwd) | sed 's@/data/ucct/bi/scratch_tmp/@/scratch/@g')
cat ../samples_id.txt | xargs -I @@ echo "mkdir @@; srun --chdir ${scratch_dir} --mem 10G --time 1:00:00 --job-name FP.@@ --output logs/FP.@@.%j.log --partition short_idx --cpus-per-task 5 singularity exec -B ${scratch_dir}/../../../ -B /srv/fastq_repo/ /data/ucct/bi/pipelines/singularity-images/fastp:0.20.0--hdbcaa40_0 fastp --in1 ${scratch_dir}/../00-reads/@@_R1.fastq.gz --in2 ${scratch_dir}/../00-reads/@@_R2.fastq.gz --thread 5 --cut_front --cut_tail --cut_mean_quality 15 --qualified_quality_phred 15 --trim_poly_x --length_required 50 --detect_adapter_for_pe --json ${scratch_dir}/@@/@@_fastp.json --html ${scratch_dir}/@@/@@_fastp.html --out1 ${scratch_dir}/@@/@@_R1_filtered.fastq.gz --out2 ${scratch_dir}/@@/@@_R2_filtered.fastq.gz --unpaired1 ${scratch_dir}/@@/@@_R1_unpaired.fastq.gz --unpaired2 ${scratch_dir}/@@/@@_R2_unpaired.fastq.gz &" > _01_fastp.sh
@@ -4,4 +4,4 @@ mkdir logs

scratch_dir=$(echo $PWD | sed "s/\/data\/bi\/scratch_tmp/\/scratch/g")

cat ../samples_id.txt | while read in; do echo "mkdir $in; srun --partition short_idx --chdir $scratch_dir --output logs/FASTQC.${in}.%j.log singularity exec -B ${scratch_dir}/../../../ /data/bi/pipelines/singularity-images/fastqc:0.11.9--hdfd78af_1 fastqc -o ${scratch_dir}/$in --nogroup -t 8 -k 8 ${scratch_dir}/../02-preprocessing/${in}/${in}_R1_filtered.fastq.gz ${scratch_dir}/../02-preprocessing/${in}/${in}_R2_filtered.fastq.gz &"; done > _01_rawfastqc.sh
cat ../samples_id.txt | while read in; do echo "mkdir $in; srun --partition short_idx --chdir $scratch_dir --output logs/FASTQC.${in}.%j.log singularity exec -B ${scratch_dir}/../../../ /data/ucct/bi/pipelines/singularity-images/fastqc:0.11.9--hdfd78af_1 fastqc -o ${scratch_dir}/$in --nogroup -t 8 -k 8 ${scratch_dir}/../02-preprocessing/${in}/${in}_R1_filtered.fastq.gz ${scratch_dir}/../02-preprocessing/${in}/${in}_R2_filtered.fastq.gz &"; done > _01_rawfastqc.sh
@@ -6,6 +6,7 @@ do
SAMPLE_ID=$(echo ${in})
TOTAL_READS=$(grep '1-initial' ${in}/tables/READ_COUNTS.txt | cut -f2)
MAPPEDREADS=$(grep '3-match' ${in}/tables/READ_COUNTS.txt | cut -f2)
PCTMAPPED=$(awk "BEGIN {printf \"%.2f\", ($MAPPEDREADS/$TOTAL_READS)*100}")
FLU_TYPE=$(paste <(grep '4-[A-C]_MP' ${in}/tables/READ_COUNTS.txt | cut -f1 | cut -d '_' -f1 | cut -d '-' -f2) <(grep '4-[A-B]_HA' ${in}/tables/READ_COUNTS.txt | cut -f1 | cut -d '_' -f3 | cut -d '-' -f2) <(grep '4-[A-B]_NA' ${in}/tables/READ_COUNTS.txt | cut -f1 | cut -d '_' -f3) | tr '\t' '_')
HA=$(grep '4-[A-C]_HA' ${in}/tables/READ_COUNTS.txt | cut -f2)
MP=$(grep '4-[A-C]_MP' ${in}/tables/READ_COUNTS.txt | cut -f2)
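
The PCTMAPPED field added to create_irma_stats.sh computes the mapped-read percentage with awk's printf; the same two-decimal formatting, sketched in Python:

```python
def pct_mapped(mapped_reads: int, total_reads: int) -> str:
    """Mapped-read percentage, formatted like awk's printf "%.2f".

    Mirrors the PCTMAPPED line in the diff; like the shell script, this
    sketch assumes total_reads is non-zero.
    """
    return f"{mapped_reads / total_reads * 100:.2f}"


print(pct_mapped(850, 1000))  # 85.00
print(pct_mapped(1, 3))       # 33.33
```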
@@ -5,7 +5,7 @@ mkdir logs

scratch_dir=$(echo $PWD | sed "s/\/data\/bi\/scratch_tmp/\/scratch/g")

cat ../samples_id.txt | while read in; do echo "srun --partition short_idx --cpus-per-task 32 --mem 35000M --chdir $scratch_dir --time 01:00:00 --output logs/IRMA.${in}.%j.log /data/bi/pipelines/flu-amd/flu-amd-1.1.4/IRMA FLU_AD ../02-preprocessing/${in}/${in}_R1_filtered.fastq.gz ../02-preprocessing/${in}/${in}_R2_filtered.fastq.gz ${in} --external-config ../../../DOC/irma_config.sh &"; done > _01_irma.sh
cat ../samples_id.txt | while read in; do echo "srun --partition short_idx --cpus-per-task 32 --mem 35000M --chdir $scratch_dir --time 01:00:00 --output logs/IRMA.${in}.%j.log /data/ucct/bi/pipelines/flu-amd/flu-amd-1.1.4/IRMA FLU_AD ../02-preprocessing/${in}/${in}_R1_filtered.fastq.gz ../02-preprocessing/${in}/${in}_R2_filtered.fastq.gz ${in} --external-config ../../../DOC/irma_config.sh &"; done > _01_irma.sh

echo 'bash create_irma_stats.sh' > _02_create_stats.sh

@@ -107,7 +107,7 @@ cat <<EOF > assembly.sbatch
# module load Nextflow/23.10.0 singularity
export NXF_OPTS="-Xms500M -Xmx8G"

nextflow run /data/bi/pipelines/nf-core-bacass/nf-core-bacass-2.3.1/main.nf \\
nextflow run /data/ucct/bi/pipelines/nf-core-bacass/nf-core-bacass-2.4.0/2_4_0/main.nf \\
-c ../../DOC/hpc_slurm_assembly.config \\
-profile singularity \\
--input samplesheet.csv \\
@@ -119,8 +119,8 @@ nextflow run /data/bi/pipelines/nf-core-bacass/nf-core-bacass-2.3.1/main.nf \\
--fastp_args '--qualified_quality_phred 20 --cut_mean_quality 20' \\
--skip_kraken2 true \\
--skip_kmerfinder false \\
--kmerfinderdb /data/bi/references/kmerfinder/latest/bacteria \\
--ncbi_assembly_metadata /data/bi/references/bacteria/20240626/assembly_summary_refseq.txt \\
--kmerfinderdb /data/ucct/bi/references/kmerfinder/latest/bacteria \\
--ncbi_assembly_metadata /data/ucct/bi/references/bacteria/20240626/assembly_summary_refseq.txt \\
${PROKKA_ARGS} \\
-resume

5 changes: 3 additions & 2 deletions bu_isciii/templates/assembly/DOC/hpc_slurm_assembly.config
@@ -5,7 +5,7 @@
singularity {
enabled = true
autoMounts = true
singularity.cacheDir = '/data/bi/pipelines/singularity-images'
singularity.cacheDir = '/data/ucct/bi/pipelines/singularity-images'
}

process {
@@ -195,7 +195,8 @@ process {
ext.args = {
[
'--force',
params.prokka_args ? "${params.prokka_args}" : ''
params.prokka_args ? "${params.prokka_args}" : '',
"--locustag ${meta.id}"
].join(' ').trim()
}
publishDir = [
10 changes: 5 additions & 5 deletions bu_isciii/templates/blast_nt/ANALYSIS/ANALYSIS02_BLAST/lablog
@@ -6,8 +6,8 @@ mkdir logs
# Location of assemblies to a variable so it only has to be changed here
LOCATION=../*/*/assembly/*/*
# Other databases:
# /data/bi/references/BLAST_dbs/nt_20211025/nt
BLAST_DATABASE="/data/bi/references/virus/BLAST/all_virus.fasta"
# /data/ucct/bi/references/BLAST_dbs/nt_20211025/nt
BLAST_DATABASE="/data/ucct/bi/references/BLAST_dbs/viral_ncbi/viral_genomes_ncbi"

# if there are scaffolds, uncompress the scaffolds in its dir (zcat for decompression)
# if there are contigs and no scaffolds, uncompress the contigs as scaffolds in its dir
@@ -27,7 +27,7 @@ cat ../samples_id.txt | while read in; do
done

# NOTE3: change the -query flag to meet your requirements
cat ../samples_id.txt | xargs -I %% echo "srun --chdir ${scratch_dir} --partition middle_idx --mem 200G --time 48:00:00 --cpus-per-task 10 --output logs/BLASTN_%%_%j.log --job-name BLASTN_%% singularity exec -B ${scratch_dir}/../../ -B /data/bi/references/virus/BLAST/ /data/bi/pipelines/singularity-images/blast:2.11.0--pl5262h3289130_1 blastn -num_threads 10 -db ${BLAST_DATABASE} -query ${scratch_dir}/%%/%%.scaffolds.fa -out ${scratch_dir}/%%/%%_blast.tsv -outfmt '6 qseqid stitle qaccver saccver pident length mismatch gaps qstart qend sstart send evalue bitscore slen qlen qcovs' &" > _01_blast.sh
cat ../samples_id.txt | xargs -I %% echo "srun --chdir ${scratch_dir} --partition middle_idx --mem 200G --time 48:00:00 --cpus-per-task 10 --output logs/BLASTN_%%_%j.log --job-name BLASTN_%% singularity exec -B ${scratch_dir}/../../ -B /data/ucct/bi/references/BLAST_dbs/viral_ncbi/ /data/ucct/bi/pipelines/singularity-images/blast:2.11.0--pl5262h3289130_1 blastn -num_threads 10 -db ${BLAST_DATABASE} -query ${scratch_dir}/%%/%%.scaffolds.fa -out ${scratch_dir}/%%/%%_blast.tsv -outfmt '6 qseqid stitle qaccver saccver pident length mismatch gaps qstart qend sstart send evalue bitscore slen qlen qcovs' &" > _01_blast.sh

# Filtering criteria:
# %refCovered > 0.7
@@ -71,5 +71,5 @@ echo "rm header" >> _03_gather_results_add_header.sh
# 20: %refCovered: length/slen

# conda activate 2excel
cat ../samples_id.txt | xargs -I %% echo "srun --chdir ${scratch_dir} --partition short_idx --mem 10G --time 1:00:00 --output logs/2excel_%%.log --job-name 2excel_%% python /data/bi/pipelines/utilities/export_excel_from_csv.py --input_file %%/%%_blast_filt.tsv --delimiter '\t' --output_filename %%/%%_blast_filt --it_has_index --it_has_header" > _04_to_excel.sh
echo "srun --chdir ${scratch_dir} --partition short_idx --mem 10G --time 1:00:00 --output logs/2excel_all.log --job-name 2excel_all python /data/bi/pipelines/utilities/export_excel_from_csv.py --input_file all_samples_filtered_BLAST_results.tsv --delimiter '\t' --output_filename all_samples_filtered_BLAST_results --it_has_index --it_has_header" >> _04_to_excel.sh
cat ../samples_id.txt | xargs -I %% echo "srun --chdir ${scratch_dir} --partition short_idx --mem 10G --time 1:00:00 --output logs/2excel_%%.log --job-name 2excel_%% python /data/ucct/bi/pipelines/utilities/export_excel_from_csv.py --input_file %%/%%_blast_filt.tsv --delimiter '\t' --output_filename %%/%%_blast_filt --it_has_index --it_has_header" > _04_to_excel.sh
echo "srun --chdir ${scratch_dir} --partition short_idx --mem 10G --time 1:00:00 --output logs/2excel_all.log --job-name 2excel_all python /data/ucct/bi/pipelines/utilities/export_excel_from_csv.py --input_file all_samples_filtered_BLAST_results.tsv --delimiter '\t' --output_filename all_samples_filtered_BLAST_results --it_has_index --it_has_header" >> _04_to_excel.sh