Commit

Updated parallel computing MPI examples
pkrastev committed Apr 23, 2024
1 parent 28c9726 commit 7926ebf
Showing 18 changed files with 397 additions and 38 deletions.
15 changes: 8 additions & 7 deletions Parallel_Computing/MPI/Example1/README.md
@@ -111,31 +111,32 @@ end program pi_monte_carlo
#SBATCH -J pi_monte_carlo
#SBATCH -o pi_monte_carlo.out
#SBATCH -e pi_monte_carlo.err
-#SBATCH -p rocky
+#SBATCH -p test
#SBATCH -t 30
#SBATCH -n 8
#SBATCH --mem-per-cpu=4000

# Load required modules
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01

# Run program
-srun -n 8 --mpi=pmix ./pi_monte_carlo.x
+srun -n $SLURM_NTASKS --mpi=pmix ./pi_monte_carlo.x
```

### Example Usage:

```bash
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01
make
sbatch run.sbatch
```

### Example Output:

```bash
> cat pi_monte_carlo.out
Exact PI: 3.14159265
-Computed PI: 3.14156124
-Error: 0.00100%
-Total time: 14.72 sec
+Computed PI: 3.14159179
+Error: 0.00003%
+Total time: 31.00 sec
```
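
For orientation, the heart of this example is a parallel Monte Carlo estimate of pi: each rank samples random points in the unit square, counts how many land inside the quarter circle, and an `MPI_Reduce` sums the counts onto rank 0. The sketch below only illustrates that pattern; it is not the repository's `pi_monte_carlo.f90`, and the per-rank sample count, seeding, and output format are assumptions.

```fortran
! Illustrative sketch of an MPI Monte Carlo pi estimator.
! Not the repo's pi_monte_carlo.f90; sample count and seeding are assumed.
program pi_mc_sketch
  use mpi
  implicit none
  integer :: ierr, rank, nprocs
  integer(kind=8) :: i, n_local, hits_local, hits_total
  real(kind=8) :: x, y, pi_est
  real(kind=8), parameter :: pi_exact = 3.14159265358979d0

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

  n_local = 10000000_8        ! samples per rank (assumed)
  hits_local = 0_8
  call random_seed()          ! simplistic seeding; ranks may overlap streams

  do i = 1_8, n_local
     call random_number(x)
     call random_number(y)
     if (x*x + y*y <= 1.0d0) hits_local = hits_local + 1_8
  end do

  ! Sum the hit counts from all ranks onto rank 0
  call MPI_Reduce(hits_local, hits_total, 1, MPI_INTEGER8, MPI_SUM, &
                  0, MPI_COMM_WORLD, ierr)

  if (rank == 0) then
     pi_est = 4.0d0 * real(hits_total, 8) / real(n_local * nprocs, 8)
     write(*,'(a,f11.8)') 'Computed PI: ', pi_est
     write(*,'(a,f8.5,a)') 'Error: ', &
          100.0d0 * abs(pi_est - pi_exact) / pi_exact, '%'
  end if

  call MPI_Finalize(ierr)
end program pi_mc_sketch
```

More samples shrink the statistical error at the usual 1/sqrt(N) rate, which is consistent with the longer runtime and smaller error in the updated output above.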
4 changes: 4 additions & 0 deletions Parallel_Computing/MPI/Example1/pi_monte_carlo.out
@@ -0,0 +1,4 @@
+Exact PI: 3.14159265
+Computed PI: 3.14159179
+Error: 0.00003%
+Total time: 31.00 sec
6 changes: 3 additions & 3 deletions Parallel_Computing/MPI/Example1/run.sbatch
@@ -2,13 +2,13 @@
#SBATCH -J pi_monte_carlo
#SBATCH -o pi_monte_carlo.out
#SBATCH -e pi_monte_carlo.err
-#SBATCH -p rocky
+#SBATCH -p test
#SBATCH -t 30
#SBATCH -n 8
#SBATCH --mem-per-cpu=4000

# Load required modules
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01

# Run program
-srun -n 8 --mpi=pmix ./pi_monte_carlo.x
+srun -n $SLURM_NTASKS --mpi=pmix ./pi_monte_carlo.x
2 changes: 1 addition & 1 deletion Parallel_Computing/MPI/Example2/Makefile
@@ -1,5 +1,5 @@
#==========================================================
-# Make file for pi_monte_carlo.f90
+# Make file
#==========================================================
F90CFLAGS = -c -O2
F90COMPILER = mpif90
9 changes: 5 additions & 4 deletions Parallel_Computing/MPI/Example2/README.md
@@ -86,28 +86,29 @@ clean :
#SBATCH -J ptrap
#SBATCH -o ptrap.out
#SBATCH -e ptrap.err
-#SBATCH -p rocky
+#SBATCH -p test
#SBATCH -t 30
#SBATCH -n 8
#SBATCH --mem-per-cpu=4000

# Load required modules
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01

# Run program
-srun -n 8 --mpi=pmix ./ptrap.x
+srun -n $SLURM_NTASKS --mpi=pmix ./ptrap.x
```

### Example Usage:

```bash
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01
make
sbatch run.sbatch
```

### Example Output:

```
> cat ptrap.out
Integral from 0.0 to 4.0 is 21.3350
```
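
The trapezoidal-rule example follows the same collective pattern: each rank integrates its own strip of [a, b] and `MPI_Reduce` sums the partial integrals on rank 0. Below is a minimal sketch, assuming the integrand `x**2` (consistent with the printed result of roughly 64/3) and a subinterval count that `nprocs` divides evenly; the repository's ptrap source may differ.

```fortran
! Illustrative sketch of an MPI trapezoidal-rule integrator.
! Not the repo's ptrap source; integrand and point count are assumed.
program ptrap_sketch
  use mpi
  implicit none
  integer :: ierr, rank, nprocs, i, n_local
  integer, parameter :: n = 1024               ! total subintervals (assumed)
  real(kind=8), parameter :: a = 0.0d0, b = 4.0d0
  real(kind=8) :: h, x_lo, local_sum, total

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

  h = (b - a) / n
  n_local = n / nprocs                         ! assume nprocs divides n
  x_lo = a + rank * n_local * h                ! left edge of this rank's strip

  ! Trapezoid rule over this rank's strip
  local_sum = (f(x_lo) + f(x_lo + n_local * h)) / 2.0d0
  do i = 1, n_local - 1
     local_sum = local_sum + f(x_lo + i * h)
  end do
  local_sum = local_sum * h

  ! Combine the partial integrals on rank 0
  call MPI_Reduce(local_sum, total, 1, MPI_DOUBLE_PRECISION, MPI_SUM, &
                  0, MPI_COMM_WORLD, ierr)

  if (rank == 0) write(*,'(a,f7.4)') 'Integral from 0.0 to 4.0 is ', total

  call MPI_Finalize(ierr)

contains

  real(kind=8) function f(x)
    real(kind=8), intent(in) :: x
    f = x * x                                  ! assumed integrand
  end function f

end program ptrap_sketch
```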
Binary file removed Parallel_Computing/MPI/Example2/nodeinfo.mod
Binary file removed Parallel_Computing/MPI/Example2/ptrap.o
1 change: 1 addition & 0 deletions Parallel_Computing/MPI/Example2/ptrap.out
@@ -0,0 +1 @@
+Integral from 0.0 to 4.0 is 21.3350
Binary file removed Parallel_Computing/MPI/Example2/ptrap.x
6 changes: 3 additions & 3 deletions Parallel_Computing/MPI/Example2/run.sbatch
@@ -2,13 +2,13 @@
#SBATCH -J ptrap
#SBATCH -o ptrap.out
#SBATCH -e ptrap.err
-#SBATCH -p rocky
+#SBATCH -p test
#SBATCH -t 30
#SBATCH -n 8
#SBATCH --mem-per-cpu=4000

# Load required modules
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01

# Run program
-srun -n 8 --mpi=pmix ./ptrap.x
+srun -n $SLURM_NTASKS --mpi=pmix ./ptrap.x
9 changes: 5 additions & 4 deletions Parallel_Computing/MPI/Example3/README.md
@@ -40,29 +40,30 @@ clean :
#SBATCH -J planczos
#SBATCH -o planczos.out
#SBATCH -e planczos.err
-#SBATCH -p rocky
+#SBATCH -p test
#SBATCH -t 30
#SBATCH -n 8
#SBATCH --mem-per-cpu=4000

# Load required modules
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01

# Run program
-srun -n 8 --mpi=pmix ./planczos.x
+srun -n $SLURM_NTASKS --mpi=pmix ./planczos.x
```

### Example usage:

```bash
-module load intel/23.0.0-fasrc01 openmpi/4.1.4-fasrc01
+module load intel/24.0.1-fasrc01 openmpi/5.0.2-fasrc01
make
sbatch run.sbatch
```

### Example Output:

```bash
> cat planczos.out
5 lowest eigenvalues - Lanczos, exact
iteration: 1
1 49.8653454477317 50.0109460873557
...
```
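
For context on what the two columns compare: the Lanczos method builds an orthonormal basis $v_1, v_2, \dots$ through the standard three-term recurrence (textbook form, not taken from the repo's planczos source):

$$
\alpha_j = v_j^{\mathsf{T}} A v_j, \qquad
\beta_j v_{j+1} = A v_j - \alpha_j v_j - \beta_{j-1} v_{j-1},
$$

and the eigenvalues of the small tridiagonal matrix $T_j = \mathrm{tridiag}(\beta, \alpha, \beta)$ approximate the extreme eigenvalues of $A$, improving with each iteration. That is why the output prints a Lanczos estimate next to the exact value at every iteration; typically only the matrix-vector product $A v_j$ needs to be distributed across MPI ranks.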
