Hi,
I was running the example scripts for HOOMD-blue umbrella sampling, and I am getting the following error.
[ubuntu-20.04.4.sif I have no name!@cs001 HOOMD]$ mpiexec --mca btl self -np 12 /path-to-SSAGES/ssages multiwalker_umbrella_input.json
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
Installing SSAGES HOOMD-blue hook.
HOOMD-blue is running on the CPU
HOOMD-blue is running on the CPU
Installing SSAGES HOOMD-blue hook.
HOOMD-blue v2.9.3 SINGLE MPI SSE SSE2
Compiled: 09/27/22
Copyright (c) 2009-2019 The Regents of the University of Michigan.
You are using HOOMD-blue. Please cite the following:
J A Anderson, J Glaser, and S C Glotzer. "HOOMD-blue: A Python package for
high-performance molecular dynamics and hard particle Monte Carlo
simulations", Computational Materials Science 173 (2020) 109363
Ranks 0-11: HOOMD-blue is running on the CPU
notice(2): -- Neighborlist exclusion statistics -- :
notice(2): Particles with 7 exclusions : 6
notice(2): Particles with 10 exclusions : 6
notice(2): Particles with 13 exclusions : 2
notice(2): Neighbors included by diameter : no
notice(2): Neighbors excluded when in the same body: no
MPI_ABORT was invoked on rank 6 in communicator MPI_COMM_WORLD
with errorcode -1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
Could you please let me know how I can resolve this? I am happy to provide any additional information.
Thank you,
Sanjib
There's a problem specific to HOOMD-blue v2.9.3 that causes this error. The easiest fix is to use another HOOMD-blue version: versions v2.6.0 through v2.9.7 (except for v2.9.3) are compatible with SSAGES 0.9.3.
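For reference, the startup banner in your log already reports which version the failing build uses ("HOOMD-blue v2.9.3"). Below is a minimal shell sketch of the workaround: build a compatible HOOMD-blue tag and rebuild SSAGES against it. The install prefix, job count, and the -DHOOMD_ROOT option are assumptions based on my reading of the SSAGES build documentation; verify them against the docs for your SSAGES release before relying on them.

# Build a compatible HOOMD-blue tag (any of v2.6.0 to v2.9.7 except v2.9.3).
git clone --recursive --branch v2.9.7 https://github.com/glotzerlab/hoomd-blue
cd hoomd-blue && mkdir build && cd build
cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/hoomd-v2.9.7   # hypothetical install path
make -j4 install
# Rebuild SSAGES against the new HOOMD-blue. The -DHOOMD_ROOT option is an
# assumption; check the exact CMake variable in the SSAGES build docs.
cd /path-to-SSAGES/build
cmake .. -DHOOMD_ROOT=$HOME/hoomd-v2.9.7
make -j4

After rebuilding, rerun the same mpiexec command; the startup banner should then report the new version instead of v2.9.3.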