Single-process MPI program gives error in multi-process env. #163

Open
hajgato opened this issue Jan 16, 2020 · 1 comment

hajgato commented Jan 16, 2020

ml vsc-mympirun/5.0.0 mpifileutils/0.9.1-iimpi-2019a

[vsc43020@node3110 vsc43020]$ mypmirun --debug dcp -h
2020-01-16 13:41:50,394 DEBUG      mypmirun.ExtOption MainThread  changed loglevel to DEBUG, previous state: (mypmirun.mympirun.factory, NOTSET), (mypmirun.ExtOptionGroup, NOTSET), (mypmirun.vsc.utils.missing, NOTSET), (mypmirun.MypmirunOption, NOTSET), (mypmirun.ExtOption, NOTSET), (mypmirun.MympirunParser, NOTSET), (mypmirun, NOTSET)
2020-01-16 13:41:50,394 DEBUG      mypmirun.MypmirunOption MainThread  parseoptions: options from environment []
2020-01-16 13:41:50,394 DEBUG      mypmirun.MypmirunOption MainThread  parseoptions: options from commandline ['--debug', 'dcp', '-h']
2020-01-16 13:41:50,394 DEBUG      mypmirun.MypmirunOption MainThread  Found remaining args ['dcp', '-h']
2020-01-16 13:41:50,394 DEBUG      mypmirun.MypmirunOption MainThread  Found options {'help': None, 'setsched': None, 'logtofile': None, 'distribute': None, 'all_gpus': None, 'pass': [], 'debugmpi': False, 'info': False, 'multi': None, 'stats': 0, 'dry_run': False, 'debuglvl': 0, 'debug': True, 'hybrid': None, 'quiet': False, 'configfiles': None, 'showsched': False, 'print_launcher': None, 'ignoreconfigfiles': None, 'output': None, 'showmpi': False, 'setmpi': None} args ['dcp', '-h']
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  Initialise case sensitive configparser
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: configfiles initially set []
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: configfiles set through commandline None
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: ignoreconfigfiles set through commandline None
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: following files were parsed []
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: following files were NOT parsed []
2020-01-16 13:41:50,395 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: sections (w/o DEFAULT) []
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: no section MAIN
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: no section ('NO', 'SECTION')
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: going to parse options through cmdline []
2020-01-16 13:41:50,396 DEBUG      mypmirun.MympirunParser MainThread  Not processing environment for options
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: options from configfile []
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  parseconfigfiles: parsed values from configfiles: {}
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  final options: {'help': None, 'setsched': None, 'logtofile': None, 'distribute': None, 'all_gpus': None, 'pass': [], 'debugmpi': False, 'info': False, 'multi': None, 'stats': 0, 'dry_run': False, 'debuglvl': 0, 'debug': True, 'hybrid': None, 'quiet': False, 'configfiles': None, 'showsched': False, 'print_launcher': None, 'ignoreconfigfiles': None, 'output': None, 'showmpi': False, 'setmpi': None}
2020-01-16 13:41:50,396 DEBUG      mypmirun.MypmirunOption MainThread  mympirun will be executed by /apps/gent/CO7/skylake-ib/software/vsc-mympirun/5.0.0/bin/mypmirun
2020-01-16 13:41:50,397 DEBUG      mypmirun        MainThread  PATH before stripfake(): /apps/gent/CO7/skylake-ib/software/mpifileutils/0.9.1-iimpi-2019a/bin:/apps/gent/CO7/skylake-ib/software/libarchive/3.4.0-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/bzip2/1.0.6-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/attr/2.4.47-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/bin64:/apps/gent/CO7/skylake-ib/software/ifort/2019.1.144-GCC-8.2.0-2.31.1/compilers_and_libraries_2019.1.144/linux/bin/intel64:/apps/gent/CO7/skylake-ib/software/icc/2019.1.144-GCC-8.2.0-2.31.1/compilers_and_libraries_2019.1.144/linux/bin/intel64:/apps/gent/CO7/skylake-ib/software/binutils/2.31.1-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/GCCcore/8.2.0/bin:/apps/gent/CO7/skylake-ib/software/vsc-mympirun/5.0.0/bin/fake:/apps/gent/CO7/skylake-ib/software/vsc-mympirun/5.0.0/bin:/usr/libexec/slurm/wrapper:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/ibutils/bin
2020-01-16 13:41:50,399 DEBUG      mypmirun        MainThread  PATH after stripfake(): /apps/gent/CO7/skylake-ib/software/mpifileutils/0.9.1-iimpi-2019a/bin:/apps/gent/CO7/skylake-ib/software/libarchive/3.4.0-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/bzip2/1.0.6-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/attr/2.4.47-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/bin64:/apps/gent/CO7/skylake-ib/software/ifort/2019.1.144-GCC-8.2.0-2.31.1/compilers_and_libraries_2019.1.144/linux/bin/intel64:/apps/gent/CO7/skylake-ib/software/icc/2019.1.144-GCC-8.2.0-2.31.1/compilers_and_libraries_2019.1.144/linux/bin/intel64:/apps/gent/CO7/skylake-ib/software/binutils/2.31.1-GCCcore-8.2.0/bin:/apps/gent/CO7/skylake-ib/software/GCCcore/8.2.0/bin:/apps/gent/CO7/skylake-ib/software/vsc-mympirun/5.0.0/bin:/usr/libexec/slurm/wrapper:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/ibutils/bin
2020-01-16 13:41:50,401 INFO       mypmirun        MainThread  Command mpirun found at /apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/bin64/mpirun
2020-01-16 13:41:50,402 DEBUG      mypmirun        MainThread  Checking whether <class 'vsc.mympirun.pmi.mpi.IntelMPI'> (MPI name: impi) matches /apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/bin64/mpirun
2020-01-16 13:41:50,403 DEBUG      mypmirun        MainThread  found mpi root: /apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1
2020-01-16 13:41:50,405 DEBUG      mypmirun        MainThread  /apps/gent/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/bin64/mpirun (real /kyukon/home/apps/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1/intel64/bin/mpirun) is in subdirectory of /kyukon/home/apps/CO7/skylake-ib/software/impi/2018.4.274-iccifort-2019.1.144-GCC-8.2.0-2.31.1
2020-01-16 13:41:50,407 DEBUG      mypmirun        MainThread  no mpirun version provided, skipping version check, match for <class 'vsc.mympirun.pmi.mpi.IntelMPI'>
2020-01-16 13:41:50,408 DEBUG      mypmirun.MypmirunOption MainThread  Found MPI classes [<class 'vsc.mympirun.pmi.mpi.IntelMPI'>, <class 'vsc.mympirun.pmi.mpi.OpenMPI4'>]
2020-01-16 13:41:50,408 DEBUG      mypmirun.MypmirunOption MainThread  Found Sched classes [<class 'slurm.Slurm'>]
2020-01-16 13:41:50,408 DEBUG      mypmirun.MypmirunOption MainThread  Found MPI class IntelMPI (scriptname mypmirun; isfake False)
2020-01-16 13:41:50,408 DEBUG      mypmirun.MypmirunOption MainThread  Found sched class Slurm from options.schedtype None (all Sched found Slurm)
2020-01-16 13:41:50,409 DEBUG      mypmirun.mympirun.factory MainThread  Created new Coupler (<class 'vsc.mympirun.factory.LogBase'>, <class 'vsc.mympirun.pmi.mpi.IntelMPI'>, <class 'slurm.Slurm'>) class for Coupler_IntelMPI_Slurm: 23018576
2020-01-16 13:41:50,410 DEBUG      mypmirun        MainThread  $EBROOTUCX not defined for UCX
2020-01-16 13:41:50,411 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  No UCX root / version found
2020-01-16 13:41:50,412 DEBUG      mypmirun        MainThread  $EBROOTFCA not defined for FCA
2020-01-16 13:41:50,413 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  No hcoll / FCA root / version found
2020-01-16 13:41:50,413 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  No MPI-specific MPI tuning
2020-01-16 13:41:50,413 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  No UCX found, so no UCX tuning
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  No hcoll found, no hcoll tuning
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Calling mpi_tune
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Found PMIv2 lib /usr/lib64/slurmpmi/libpmi.so from ['SLURM1', 'SYSTEM1', 'AUTOMATIC'] ({'SYSTEM2': '/usr/lib64/libpmi2.so', 'SLURM1': '/usr/lib64/slurmpmi/libpmi.so', 'SYSTEM1': '/usr/lib64/libpmi.so', 'AUTOMATIC': '/automatic/via/does/not/exist', 'SLURM2': '/usr/lib64/slurmpmi/libpmi2.so'})
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Set environment variable I_MPI_PMI_LIBRARY: /usr/lib64/slurmpmi/libpmi.so
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Calling mpi_pmi
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Calling mpi_debug
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd sched arguments ['--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020']
2020-01-16 13:41:50,414 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  SLURM env variables SLURM_CLUSTERS=skitty, SLURM_CLUSTER_NAME=skitty, SLURM_CONF=/etc/slurm/slurm.conf_skitty, SLURM_CPUS_ON_NODE=1, SLURM_CPU_BIND=quiet,mask_cpu:0x000000001, SLURM_CPU_BIND_LIST=0x000000001, SLURM_CPU_BIND_TYPE=mask_cpu:, SLURM_CPU_BIND_VERBOSE=quiet, SLURM_GTIDS=0, SLURM_JOBID=515337, SLURM_JOB_ACCOUNT=gvo00002, SLURM_JOB_CPUS_PER_NODE=1(x8), SLURM_JOB_GID=2543020, SLURM_JOB_ID=515337, SLURM_JOB_NAME=INTERACTIVE, SLURM_JOB_NODELIST=node3110.skitty.os,node3111.skitty.os,node3112.skitty.os,node3113.skitty.os,node3114.skitty.os,node3115.skitty.os,node3138.skitty.os,node3139.skitty.os, SLURM_JOB_NUM_NODES=8, SLURM_JOB_PARTITION=skitty, SLURM_JOB_QOS=normal, SLURM_JOB_UID=2543020, SLURM_JOB_USER=vsc43020, SLURM_LAUNCH_NODE_IPADDR=10.141.10.65, SLURM_LOCALID=0, SLURM_MEM_PER_NODE=1024, SLURM_NNODES=8, SLURM_NODEID=0, SLURM_NODELIST=node3110.skitty.os,node3111.skitty.os,node3112.skitty.os,node3113.skitty.os,node3114.skitty.os,node3115.skitty.os,node3138.skitty.os,node3139.skitty.os, SLURM_NPROCS=8, SLURM_NTASKS=8, SLURM_NTASKS_PER_NODE=1, SLURM_PRIO_PROCESS=0, SLURM_PROCID=0, SLURM_PTY_PORT=35646, SLURM_PTY_WIN_COL=132, SLURM_PTY_WIN_ROW=30, SLURM_SRUN_COMM_HOST=10.141.10.65, SLURM_SRUN_COMM_PORT=35647, SLURM_STEPID=0, SLURM_STEP_ID=0, SLURM_STEP_LAUNCHER_PORT=35647, SLURM_STEP_NODELIST=node3110.skitty.os,node3111.skitty.os,node3112.skitty.os,node3113.skitty.os,node3114.skitty.os,node3115.skitty.os,node3138.skitty.os,node3139.skitty.os, SLURM_STEP_NUM_NODES=8, SLURM_STEP_NUM_TASKS=8, SLURM_STEP_TASKS_PER_NODE=1(x8), SLURM_SUBMIT_DIR=/kyukon/home/gent/430/vsc43020/BBTest, SLURM_SUBMIT_HOST=gligar05.gastly.os, SLURM_TASKS_PER_NODE=1(x8), SLURM_TASK_PID=37338, SLURM_TOPOLOGY_ADDR=node3110.skitty.os, SLURM_TOPOLOGY_ADDR_PATTERN=node, SLURM_UMASK=0002, SLURM_WORKING_CLUSTER=skitty:master27.skitty.os:6817:8704:109
2020-01-16 13:41:50,415 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Got job info 8 nodes; with per node 1 cores, 1 ranks, 1024 mem, None gpus
2020-01-16 13:41:50,415 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Got mpi size info 8 nodes; with per node 1 cores, 1 ranks, 1024 mem, None gpus
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Unset environment variables SLURM_CPUS_ON_NODE=1 SLURM_CPU_BIND=quiet,mask_cpu:0x000000001 SLURM_CPU_BIND_LIST=0x000000001 SLURM_CPU_BIND_TYPE=mask_cpu: SLURM_CPU_BIND_VERBOSE=quiet SLURM_JOB_CPUS_PER_NODE=1(x8) SLURM_JOB_NUM_NODES=8 SLURM_LAUNCH_NODE_IPADDR=10.141.10.65 SLURM_MEM_PER_NODE=1024 SLURM_NNODES=8 SLURM_NODEID=0 SLURM_NPROCS=8 SLURM_NTASKS=8 SLURM_NTASKS_PER_NODE=1 SLURM_PRIO_PROCESS=0 SLURM_PROCID=0 SLURM_STEP_NUM_NODES=8 SLURM_STEP_NUM_TASKS=8 SLURM_STEP_TASKS_PER_NODE=1(x8) SLURM_TASKS_PER_NODE=1(x8) SLURM_TASK_PID=37338
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Got pmi cmd args ['--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024']
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd sizing arguments ['--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024']
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd environment arguments ['--export=ALL']
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Mapped PMI <class 'vsc.mympirun.pmi.pmi.PMIv2'> to flavour pmi2
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd mpi arguments ['--mpi=pmi2']
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd debug arguments []
2020-01-16 13:41:50,416 DEBUG      mypmirun.Coupler_IntelMPI_Slurm MainThread  Generated pmicmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2']
2020-01-16 13:41:50,420 DEBUG      mypmirun.RunNoShellAsyncLoopStdout MainThread  _popen_named_args {'executable': None, 'shell': False, 'stdout': -1, 'close_fds': True, 'stdin': -1, 'stderr': -2}
2020-01-16 13:41:50,423 DEBUG      mypmirun.RunNoShellAsyncLoopStdout MainThread  _init_input: process stdin closed
2020-01-16 13:41:51,625 DEBUG      mypmirun.RunNoShellAsyncLoopStdout MainThread  _wait_for_process: loop stopped after 1 iterations (ec 1 loop_continue True)

Usage: dcp [options] source target
       dcp [options] source ... target_dir

Options:
  -i, --input <file>  - read source list from file
  -p, --preserve      - preserve permissions, ownership, timestamps, extended attributes
  -s, --synchronous   - use synchronous read/write calls (O_DIRECT)
  -S, --sparse        - create sparse files when possible
  -v, --verbose       - verbose output
  -q, --quiet         - quiet output
  -h, --help          - print usage
For more information see https://mpifileutils.readthedocs.io.

srun: error: node3111.skitty.os: task 1: Exited with exit code 1
srun: error: node3112.skitty.os: task 2: Exited with exit code 1
srun: error: node3110.skitty.os: task 0: Exited with exit code 1
srun: error: node3115.skitty.os: task 5: Exited with exit code 1
srun: error: node3139.skitty.os: task 7: Exited with exit code 1
srun: error: node3114.skitty.os: task 4: Exited with exit code 1
srun: error: node3138.skitty.os: task 6: Exited with exit code 1
srun: error: node3113.skitty.os: task 3: Exited with exit code 1
2020-01-16 13:41:51,629 ERROR      mypmirun.RunNoShellAsyncLoopStdout MainThread  _post_exitcode: problem occured with cmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2', 'dcp', '-h']: (shellcmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2', 'dcp', '-h']) output
Usage: dcp [options] source target
       dcp [options] source ... target_dir

Options:
  -i, --input <file>  - read source list from file
  -p, --preserve      - preserve permissions, ownership, timestamps, extended attributes
  -s, --synchronous   - use synchronous read/write calls (O_DIRECT)
  -S, --sparse        - create sparse files when possible
  -v, --verbose       - verbose output
  -q, --quiet         - quiet output
  -h, --help          - print usage
For more information see https://mpifileutils.readthedocs.io.

srun: error: node3111.skitty.os: task 1: Exited with exit code 1
srun: error: node3112.skitty.os: task 2: Exited with exit code 1
srun: error: node3110.skitty.os: task 0: Exited with exit code 1
srun: error: node3115.skitty.os: task 5: Exited with exit code 1
srun: error: node3139.skitty.os: task 7: Exited with exit code 1
srun: error: node3114.skitty.os: task 4: Exited with exit code 1
srun: error: node3138.skitty.os: task 6: Exited with exit code 1
srun: error: node3113.skitty.os: task 3: Exited with exit code 1

2020-01-16 13:41:51,633 WARNING    mypmirun.Coupler_IntelMPI_Slurm MainThread  main: exitcode 1 > 0; cmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2', 'dcp', '-h']
2020-01-16 13:41:51,639 ERROR      mypmirun        MainThread  Main failed: main: exitcode 1 > 0; cmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2', 'dcp', '-h']
Traceback (most recent call last):
  File "/kyukon/home/apps/CO7/skylake-ib/software/vsc-mympirun/5.0.0/lib/python2.7/site-packages/vsc_mympirun-5.0.0-py2.7.egg/vsc/mympirun/main.py", line 120, in main
    instance.main()
  File "/kyukon/home/apps/CO7/skylake-ib/software/vsc-mympirun/5.0.0/lib/python2.7/site-packages/vsc_mympirun-5.0.0-py2.7.egg/vsc/mympirun/pmi/mpi.py", line 98, in main
    self.log.raiseException("main: exitcode %s > 0; cmd %s" % (exitcode, cmd))
  File "/kyukon/home/apps/CO7/skylake-ib/software/vsc-mympirun/5.0.0/lib/python2.7/site-packages/vsc_base-2.9.3-py2.7.egg/vsc/utils/fancylogger.py", line 333, in raiseException
    raise err
Exception: main: exitcode 1 > 0; cmd ['srun', '--chdir=/kyukon/scratch/gent/vo/000/gvo00002/vsc43020', '--nodes=8', '--ntasks=8', '--cpus-per-task=1', '--mem-per-cpu=1024', '--export=ALL', '--mpi=pmi2', 'dcp', '-h']
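The generated srun command above runs dcp -h once per task, so a single failing help invocation shows up as eight per-task errors. A minimal Python sketch (illustrative only, not srun itself; the inline stand-in command mimics dcp -h printing usage and exiting 1):

```python
import subprocess
import sys

# Stand-in for 'dcp -h': prints its usage line, then exits with code 1,
# which is what each srun task reported above.
NTASKS = 8
cmd = [sys.executable, "-c",
       "print('Usage: dcp [options] source target'); raise SystemExit(1)"]

# srun launches the command once per task; each non-zero exit becomes
# a separate 'Exited with exit code 1' error line.
for task in range(NTASKS):
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        print("srun: error: task %d: Exited with exit code %d"
              % (task, proc.returncode))
```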

boegel commented Jan 24, 2020

That's just because dcp -h exits with a non-zero exit code though...

There's not much we can do about this, other than asking the dcp developers to change it (-h or --help should not exit with a non-zero exit code)?
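The point can be sketched in Python: mympirun treats any exit code above zero as a failure, regardless of whether the wrapped command merely printed its help text. This is an illustrative stand-in, not the actual vsc-mympirun code (the helper name run_wrapped is made up; the raised message just mirrors the log format above):

```python
import subprocess
import sys

def run_wrapped(cmd):
    """Run a command and, like mympirun's main, raise on any exit code > 0."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode > 0:
        raise RuntimeError("main: exitcode %s > 0; cmd %s"
                           % (proc.returncode, cmd))
    return proc.stdout

try:
    # Stand-in for 'srun ... dcp -h': prints usage, then exits 1 like dcp -h.
    run_wrapped([sys.executable, "-c",
                 "print('Usage: dcp ...'); raise SystemExit(1)"])
except RuntimeError as err:
    print(err)
```

So the wrapper's error report is correct given dcp's behaviour; the surprising part is only that a help request exits non-zero in the first place.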
