[Manual] Devito on CX3 @Imperial College HPC
Useful HPC pages:
- https://wiki.imperial.ac.uk/display/HPC/High+Performance+Computing
- https://wiki.imperial.ac.uk/display/HPC/MPI+Jobs
- https://icl-rcs-user-guide.readthedocs.io/en/latest/hpc/queues/classes-of-jobs/
- https://icl-rcs-user-guide.readthedocs.io/en/latest/hpc/queues/mpi-jobs/
- Job sizing guidance (in the ICL RCS user guide linked above)
# SSH to a login node (usually lands on an AMD EPYC 7742 64-core processor)
ssh [email protected]
# SSH to a login node (usually lands on an Intel(R) Xeon(R) Platinum 8358 CPU @ 2.60GHz)
ssh -oPubkeyAuthentication=no [email protected]
# Check the loaded modules; `tools/prod` should be among them
module list
# If not, run: module load tools/prod
# module load Python/3.11.2-GCCcore-12.2.0-bare
module load Python/3.11.5-GCCcore-13.2.0
# Load OpenMPI
# module load tools/dev
module load OpenMPI/4.1.6-GCC-13.2.0 # Needed to install requirements-mpi.txt
Note: while these nodes will run software built using the Intel MPI and compiler, the HPC team strongly recommends rebuilding your software with GCC and OpenMPI, as they observe better performance with these.
# Other useful modules, e.g. git
module load git
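If the virtual environment activated below does not exist yet, here is a minimal one-time setup sketch; the paths are assumptions chosen to match the `source` lines used throughout this page, so adjust them to your layout:
# One-time setup: create a virtual environment and install Devito from source (assumed paths)
python3 -m venv $HOME/environments/python3.11.5-env
source $HOME/environments/python3.11.5-env/bin/activate
git clone https://github.com/devitocodes/devito.git $HOME/devitocodes/devito
cd $HOME/devitocodes/devito
python3 -m pip install -e .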
# If everything went well, you should be able to run a typical operator, e.g.:
source ../../environments/python3.11.5-env/bin/activate
export DEVITO_LOGGING=DEBUG
export DEVITO_LANGUAGE=openmp
python examples/seismic/acoustic/acoustic_example.py
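Optionally, pin the OpenMP thread count explicitly; by default Devito uses the cores available on the node. The value below is only an example:
# Optional: set the number of OpenMP threads explicitly (example value)
export OMP_NUM_THREADS=16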
For MPI:
# Install the MPI requirements
python3 -m pip install --force-reinstall --upgrade --no-cache-dir -r requirements-mpi.txt
# Do a test run, it should work
DEVITO_MPI=1 mpirun -n 2 python3 examples/seismic/acoustic/acoustic_example.py
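If the MPI run misbehaves, a quick sanity check that mpi4py (pulled in by requirements-mpi.txt) was built against the loaded OpenMPI:
# Each rank should print its own rank number (0 and 1)
mpirun -n 2 python3 -c "from mpi4py import MPI; print(MPI.COMM_WORLD.Get_rank())"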
For interactive jobs:
qsub -I -l select=2:ncpus=16:mem=8gb:mpiprocs=2:ompthreads=8 -l walltime=02:00:00
module load Python/3.11.5-GCCcore-13.2.0
module load OpenMPI/4.1.6-GCC-13.2.0
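Once the interactive session starts, a sketch of a quick test run inside it (same assumed paths as above):
# Inside the interactive job: activate the environment and run a small MPI test
cd /rds/general/user/$(whoami)/home/devitocodes/devito
source ../../environments/python3.11.5-env/bin/activate
export DEVITO_LANGUAGE=openmp
DEVITO_MPI=1 mpiexec python3 examples/seismic/acoustic/acoustic_example.py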
For batch jobs, a sample PBS script:
#!/bin/bash
#PBS -lselect=1:ncpus=24:mem=120gb:mpiprocs=2:ompthreads=12
#PBS -lwalltime=00:30:00
lscpu
whoami
cd /rds/general/user/$(whoami)/home/devitocodes/devito
export DEVITO_HOME=/rds/general/user/$(whoami)/home/devitocodes/devito
module load tools/prod
module load Python/3.11.5-GCCcore-13.2.0
module load OpenMPI/4.1.6-GCC-13.2.0
source ../../environments/python3.11.5-env/bin/activate
# module load intel-compilers/2022.2.1
# module load mpi
# export DEVITO_ARCH=intel
export DEVITO_LANGUAGE=openmp
export DEVITO_LOGGING=DEBUG
export DEVITO_MPI=1
# export TMPDIR=/rds/general/user/$(whoami)/home/devitocodes/cache
mpiexec python3 $DEVITO_HOME/examples/seismic/acoustic/acoustic_example.py -d 100 100 100 -so 8 --tn 200 --autotune aggressive
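Save the script under a name of your choice (`job.pbs` below is hypothetical), then submit and monitor it with the standard PBS commands:
qsub job.pbs          # submit; prints the job ID
qstat -u $(whoami)    # check the status of your jobs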
For Rome (AMD EPYC 7742) nodes, the script below runs 8 MPI ranks with 16 OpenMP threads each (8 × 16 = 128 cores):
#!/bin/bash
#PBS -lselect=1:ncpus=128:mem=100gb:mpiprocs=8:ompthreads=16
#PBS -lwalltime=02:00:00
lscpu
whoami
cd /rds/general/user/$(whoami)/home/devitocodes/devito
export DEVITO_HOME=/rds/general/user/$(whoami)/home/devitocodes/devito
module load tools/prod
module load Python/3.11.2-GCCcore-12.2.0-bare
source environments/python311-env/bin/activate
module load intel-compilers/2022.2.1
module load mpi
export DEVITO_ARCH=gcc
export DEVITO_LANGUAGE=openmp
export DEVITO_LOGGING=DEBUG
export DEVITO_MPI=1
# export TMPDIR=/rds/general/user/$(whoami)/home/devitocodes/cache
mpiexec python3 $DEVITO_HOME/examples/seismic/acoustic/acoustic_example.py -d 100 100 100 -so 8 --tn 200 --autotune aggressive
For an interactive GPU job (one RTX6000):
qsub -I -l select=1:ncpus=16:mem=96gb:mpiprocs=1:ngpus=1:gpu_type=RTX6000 -l walltime=02:00:00
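Once on the GPU node, a minimal sketch for targeting the GPU with Devito; the exact NVHPC module name is an assumption, so check `module avail` for what is installed:
module load tools/prod
# module load NVHPC/<version>   # assumed: an NVIDIA HPC SDK module is needed for nvc
export DEVITO_PLATFORM=nvidiaX   # target an NVIDIA GPU
export DEVITO_ARCH=nvc           # use the NVIDIA HPC compiler
export DEVITO_LANGUAGE=openacc   # generate OpenACC offloading code
python3 examples/seismic/acoustic/acoustic_example.py -d 100 100 100 -so 8 --tn 200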