
mpi_demos

Demonstration of basic MPI functions with mpi4py

Scripts

  • boiler_plate.py → the template for MPI scripts; mpi4py calls MPI.Init automatically when the MPI module is imported, and MPI.Finalize is registered to run automatically at exit; it is called explicitly here only for demonstration
  • hello_world.py → a simple send/receive operation in parallel
  • broadcast.py → communicating a dataset/dictionary that exists only on master (rank = 0) to all other processes; also try MPI.Barrier
  • integrate/trapezoidal.py → script for the trapezoidal calculation for a given function func, within [a,b] with n points
  • integrate/mpi_integrate_send_recv.py → compute the trapezoidal integral in parallel and combine the partial sums with send/receive operations for the final estimate
  • integrate/mpi_integrate_reduce.py → improve on integrate/mpi_integrate_send_recv.py by replacing the send/receive communication with MPI.Reduce
  • domain_decomp/grid-hopping.py → a simple demonstration of domain decomposition

Creating a virtual environment

We recommend creating a virtual environment for the demonstrations in this workshop. You can create either a conda environment or a native Python one. If you have conda installed, use 1:

conda create -n <venv_name> python=3.10
# Follow onscreen instructions to download necessary packages (if needed)
conda activate <venv_name> 

Use conda deactivate to move out of the virtual environment.

If you have only Python and not conda (venv requires Python version >= 3.6), then try 2:

python -m venv <venv_name>
source ./<venv_name>/bin/activate

Type deactivate to come out of the virtual environment.

Installing mpi4py inside the virtual environment 3

Use either:

pip install --upgrade pip
pip install mpi4py

or,

conda install -c conda-forge mpi4py

Instructions

  • In the file hello_world.py, add a simple send/receive operation where all ranks send a message to rank 0. Run using:

    mpirun -n <nproc> python hello_world.py
    
  • In the file broadcast.py, create a dataset/dictionary that exists only on rank = 0 and broadcast it to all other processes. Run using:

    mpirun -n <nproc> python broadcast.py
    
  • In the integrate folder, go through trapezoidal.py to see the general serial implementation of the trapezoidal rule.

  • Move on to the file mpi_integrate_send_recv.py, which computes the integral in parallel using MPI send/receive operations. Run using:

    mpirun -n 5 python mpi_integrate_send_recv.py
    
  • Update the file mpi_integrate_reduce.py to use the global operation MPI.Reduce.

  • In the folder domain_decomp, the file grid-hopping.py demonstrates a simple domain decomposition. Run using:

    mpirun -n 2 python grid-hopping.py
    

Footnotes

  1. https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html

  2. https://docs.python.org/3/library/venv.html

  3. https://mpi4py.readthedocs.io/en/stable/
