Message Passing Interface (MPI) is a standard for parallel programming in distributed computing environments. This tutorial will guide you through using `vsl.mpi`, a V package that provides bindings to MPI for parallel computing.
Before you begin, make sure you have the following installed:
- OpenMPI library:
  - Ubuntu: `sudo apt install libopenmpi-dev`
  - Arch Linux: `sudo pacman -S openmpi`
  - macOS: `brew install openmpi`
  - Windows: `choco install openmpi`
  - FreeBSD: `pkg install openmpi`
- VSL: `v install vsl`

You can check out the official VSL documentation for more information.
Compile the MPI example with:

```sh
v -o mpi_example -prod -cc mpicc main.v
```

Here `mpicc` is the MPI compiler wrapper, which adds the MPI include and link flags on top of your system C compiler.
Run the example with:

```sh
mpirun -np 2 -H localhost:8 ./mpi_example
```

`-np 2` launches two processes, and `-H localhost:8` allows up to eight slots on localhost.
You should see output from both ranks indicating successful MPI communication.
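For example, with two ranks the output should look roughly like this, based on the `println` calls in the code below (the interleaving of lines from different ranks is nondeterministic):

```
Test MPI 01
Hello from rank 0
The world has 2 processes
Hello from rank 1
The world has 2 processes
```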
Now, let's dive into the example code to understand how MPI is used with `vsl.mpi`.
```v
import vsl.float.float64
import vsl.mpi

fn main() {
	// Start the MPI runtime; `!` propagates any initialization error
	mpi.initialize()!
	defer {
		mpi.finalize()
	}

	if mpi.world_rank() == 0 {
		println('Test MPI 01')
	}

	println('Hello from rank ${mpi.world_rank()}')
	println('The world has ${mpi.world_size()} processes')

	// Split the work: each rank fills only its own contiguous slice of x,
	// leaving the rest of the array at zero
	n := 11
	mut x := []f64{len: n}
	id, sz := mpi.world_rank(), mpi.world_size()
	start, endp1 := (id * n) / sz, ((id + 1) * n) / sz
	for i := start; i < endp1; i++ {
		x[i] = f64(i)
	}

	// ... (MPI functions)
}
```
The key calls in this snippet are:

- `mpi.initialize()!`: Initialize MPI. The `!` propagates any error, so the program stops if initialization fails.
- `mpi.finalize()`: Clean up and finalize MPI at the end of the program.
- `mpi.world_rank()`: Get the rank of the current process.
- `mpi.world_size()`: Get the total number of processes.
- `mpi.Communicator.new([])`: Create a new communicator.
Now, let's dive into the MPI functions used in the example.
Synchronize all processes:

```v
comm := mpi.Communicator.new([])!
comm.barrier()
```
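A barrier guarantees that no process continues past that point until every process has arrived, which is useful for separating phases of a computation. A minimal sketch of that pattern:

```v
// Ensure every rank has finished its local work
// before rank 0 reports progress.
comm.barrier()
if mpi.world_rank() == 0 {
	println('all ranks reached the barrier')
}
```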
Broadcast values from the root process (rank 0) to all processes:

```v
comm.bcast_from_root_f64(vals)
```
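A minimal sketch of how this might be used, assuming the broadcast copies the contents of rank 0's buffer into the same-sized buffer on every other rank:

```v
// Rank 0 prepares the data; the other ranks start with zeros.
mut vals := []f64{len: 3}
if mpi.world_rank() == 0 {
	for i in 0 .. vals.len {
		vals[i] = f64(i + 1)
	}
}
comm.bcast_from_root_f64(vals)
// Every rank now holds [1.0, 2.0, 3.0] in vals.
```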
Element-wise sum of `orig` across all processes, with the result stored in `dest` on rank 0:

```v
comm.reduce_sum_f64(mut dest, orig)
```
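For instance, continuing from the snippet above (a sketch, assuming the result is only meaningful on rank 0):

```v
// Each rank contributes its own rank number; rank 0 collects the sum.
orig := [f64(mpi.world_rank())]
mut dest := []f64{len: 1}
comm.reduce_sum_f64(mut dest, orig)
if mpi.world_rank() == 0 {
	// With -np 2 this prints [1.0], i.e. 0 + 1.
	println('sum on root: ${dest}')
}
```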
Element-wise sum of `orig` across all processes, with the result stored in `dest` on every rank:

```v
comm.all_reduce_sum_f64(mut dest, orig)
```
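This suggests one way the elided part of the example could be completed: since each rank filled only its own slice of `x` and left the rest at zero, an element-wise all-reduce sum reassembles the full array on every rank. A sketch, not necessarily what the full example does:

```v
// Combine the per-rank slices: the zeros elsewhere make
// the element-wise sum act like a gather.
mut y := []f64{len: n}
comm.all_reduce_sum_f64(mut y, x)
// y == [0.0, 1.0, ..., 10.0] on every rank.
```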
Compile and run the example as described earlier. You should see output from both ranks, indicating successful MPI communication.
Congratulations! You've completed the MPI with `vsl.mpi` tutorial. This introduction should help you get started with parallel programming in V using MPI. For more detailed information, refer to the official `vsl.mpi` documentation.
Feel free to explore additional MPI functions and experiment with different parallel algorithms to harness the full power of MPI in your V programs.
Happy parallel programming!