Add shell wrapper around mpiexec (#386)
* Add Julia wrapper around `mpiexec`

* Simplify extraction of project flag

Co-authored-by: Simon Byrne <simonbyrne@gmail.com>

* Move mpiexecjl script to bin/

* Use `mpiexecjl` in documentation

* Add tests for `mpiexecjl`

* Run `mpiexecjl` test only on Unix systems

* Fix error in `mpiexecjl` and improve test

* Make `mpiexecjl` a POSIX shell script instead of a Julia one

* Update bin/mpiexecjl

[skip ci]

Co-authored-by: Simon Byrne <simonbyrne@gmail.com>

* Print usage with `-h` and `--help` and with no non-project argument

* add shellcheck action

* disable shellcheck on variable

* Update bin/mpiexecjl

Co-authored-by: Simon Byrne <simonbyrne@gmail.com>

Co-authored-by: Simon Byrne <simonbyrne@gmail.com>
giordano and simonbyrne committed May 14, 2020
1 parent b16fc92 commit c5ab8ac
Showing 9 changed files with 175 additions and 2 deletions.
18 changes: 18 additions & 0 deletions .github/workflows/shellcheck.yml
@@ -0,0 +1,18 @@
name: shellcheck

on:
  push:
    branches:
      - master
    tags: '*'
  pull_request:

jobs:
  shellcheck:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: sudo apt install shellcheck
      - name: Check scripts
        run: shellcheck bin/*
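
Assuming `shellcheck` is installed locally (e.g. with `sudo apt install shellcheck`, as in the workflow above), the same check can be reproduced before pushing:

```
$ shellcheck bin/*
```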
64 changes: 64 additions & 0 deletions bin/mpiexecjl
@@ -0,0 +1,64 @@
#!/bin/sh
#
# Copyright (C) 2020 Simon Byrne, Mosè Giordano
# License is MIT "Expat"
#
### Commentary:
#
# Command line utility to call the `mpiexec` binary used by the `MPI.jl` version
# in the given Julia project. It has the same syntax as the `mpiexec` binary
# that would be called, with the additional `--project=...` flag to select a
# different Julia project.
#
# Examples of usage (the MPI flags available depend on the MPI implementation
# called):
#
#   $ mpiexecjl --version
#   $ mpiexecjl -n 40 julia mpi-script.jl
#   $ mpiexecjl --project=my_experiment -n 80 --oversubscribe julia mpi-script.jl
#
### Code:

usage () {
    echo "Usage: ${0} [--project=...] MPIEXEC_ARGUMENTS..."
    echo "Call the mpiexec binary in the Julia environment specified by the --project option."
    echo "If no project is specified, the MPI associated with the global Julia environment will be used."
    echo "All other arguments are forwarded to mpiexec."
}

for arg; do
    shift
    case "${arg}" in
        --project | --project=*)
            PROJECT_ARG="${arg}"
            ;;
        -h | --help)
            usage
            echo "Below is the help of the current mpiexec."
            echo
            set -- "${@}" "${arg}"
            ;;
        *)
            set -- "${@}" "${arg}"
            ;;
    esac
done

if [ -z "${*}" ]; then
    echo "ERROR: no arguments specified." 1>&2
    echo
    usage
    exit 1
fi

# shellcheck disable=SC2016
SCRIPT='
using MPI
ENV["JULIA_PROJECT"] = dirname(Base.active_project())
mpiexec(exe -> run(`$exe $ARGS`))'

if [ -n "${PROJECT_ARG}" ]; then
    julia "${PROJECT_ARG}" --color=yes --startup-file=no -q --compile=min -O0 -e "${SCRIPT}" -- "${@}"
else
    julia --color=yes --startup-file=no -q --compile=min -O0 -e "${SCRIPT}" -- "${@}"
fi
2 changes: 1 addition & 1 deletion docs/make.jl
@@ -25,7 +25,7 @@ for (example_title, example_md) in EXAMPLES
    println(mdfile)

    println(mdfile, "```")
-   println(mdfile, "> mpiexec -n 3 julia $example_jl")
+   println(mdfile, "> mpiexecjl -n 3 julia $example_jl")
    cd(@__DIR__) do
        write(mdfile, mpiexec(cmd -> read(`$cmd -n 3 $(Base.julia_cmd()) --project $example_jl`)))
    end
35 changes: 34 additions & 1 deletion docs/src/configuration.md
@@ -77,4 +77,37 @@ The test suite can also be modified by the following variables:

- `JULIA_MPIEXEC_TEST_ARGS`: Additional arguments to be passed to the MPI launcher for the tests only.
- `JULIA_MPI_TEST_ARRAYTYPE`: Set to `CuArray` to test the CUDA-aware interface with
-  [`CuArray`s](https://github.com/JuliaGPU/CuArrays.jl) buffers.
+  [`CuArray`s](https://github.com/JuliaGPU/CuArrays.jl) buffers.

## Julia wrapper for `mpiexec`

Since you can configure `MPI.jl` to use one of several MPI implementations,
different Julia projects may end up using different implementations. It can
therefore be cumbersome to find out which `mpiexec` executable is associated
with a specific project. To make this easy, on Unix-based systems `MPI.jl`
comes with a thin project-aware wrapper around `mpiexec`, called `mpiexecjl`.

### Installation

You can install `mpiexecjl` with [`MPI.install_mpiexecjl()`](@ref). The default
destination directory is `joinpath(DEPOT_PATH[1], "bin")`, which usually
translates to `~/.julia/bin`, but check the value on your system. You can also
tell `MPI.install_mpiexecjl` to install to a different directory.
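
For example, a typical installation to the default directory looks like the
following sketch (the path printed depends on your system):

```
julia> using MPI

julia> MPI.install_mpiexecjl()
[ Info: Installing `mpiexecjl` to `/home/user/.julia/bin`...
[ Info: Done!
```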

To call this wrapper conveniently, we recommend adding the destination
directory to your [`PATH`](https://en.wikipedia.org/wiki/PATH_(variable))
environment variable.
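
With the default destination directory, in a POSIX shell this can be done with
(adjust the path if you installed to a different directory):

```
$ export PATH=$HOME/.julia/bin:$PATH
```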

### Usage

`mpiexecjl` has the same syntax as the `mpiexec` binary it will call, but it
additionally accepts a `--project` option to select the binary associated with
the `MPI.jl` version in the given project. If no `--project` flag is passed,
the `MPI.jl` in the global Julia environment is used instead.

After installing `mpiexecjl` and adding its directory to `PATH`, you can run it
with:

```
$ mpiexecjl --project=/path/to/project -n 20 julia script.jl
```
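
Without the `--project` flag, the `mpiexec` associated with the `MPI.jl` in the
global environment is called:

```
$ mpiexecjl -n 20 julia script.jl
```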
1 change: 1 addition & 0 deletions docs/src/environment.md
@@ -4,6 +4,7 @@

```@docs
mpiexec
MPI.install_mpiexecjl
```

## Enums
1 change: 1 addition & 0 deletions src/MPI.jl
@@ -51,6 +51,7 @@ include("collective.jl")
include("topology.jl")
include("onesided.jl")
include("io.jl")
include("mpiexec_wrapper.jl")

include("deprecated.jl")

24 changes: 24 additions & 0 deletions src/mpiexec_wrapper.jl
@@ -0,0 +1,24 @@
"""
MPI.install_mpiexecjl(; command::String = "mpiexecjl",
destdir::String = joinpath(DEPOT_PATH[1], "bin"),
force::Bool = false, verbose::Bool = true)
Install the `mpiexec` wrapper to `destdir` directory, with filename `command`.
Set `force` to `true` to overwrite an existing destination file with the same
path. If `verbose` is `true`, the installation prints information about the
progress of the process.
"""
function install_mpiexecjl(; command::String = "mpiexecjl",
destdir::String = joinpath(DEPOT_PATH[1], "bin"),
force::Bool = false, verbose::Bool = true)
# Adapted from https://github.com/fredrikekre/jlpkg.
destdir = abspath(expanduser(destdir))
exec = joinpath(destdir, command)
if ispath(exec) && !force
error("file `$(exec)` already exists; use `MPI.install_mpiexecjl(force=true)` to overwrite.")
end
mkpath(destdir)
verbose && @info "Installing `$(command)` to `$(destdir)`..."
cp(joinpath(@__DIR__, "..", "bin", "mpiexecjl"), exec; force = force)
verbose && @info "Done!"
end
26 changes: 26 additions & 0 deletions test/mpiexecjl.jl
@@ -0,0 +1,26 @@
using Test, Pkg
using MPI

@testset "mpiexecjl" begin
    mktempdir() do dir
        # Install MPI locally, so that we can test the `--project` flag to
        # `mpiexecjl`
        Pkg.activate(dir)
        Pkg.add("MPI")
        # Test installation
        @test_logs (:info, r"Installing") (:info, r"Done") MPI.install_mpiexecjl(; destdir = dir)
        # Test a run of mpiexec
        mpiexecjl = joinpath(dir, "mpiexecjl")
        julia = joinpath(Sys.BINDIR, Base.julia_exename())
        example = joinpath(@__DIR__, "..", "docs", "examples", "01-hello.jl")
        @test success(`$(mpiexecjl) --project=$(dir) $(julia) --startup-file=no -q $(example)`)
        # Test help messages
        for help_flag in ("-h", "--help")
            help_message = read(`$(mpiexecjl) --project=$(dir) $(help_flag)`, String)
            @test occursin(r"Usage:.*MPIEXEC_ARGUMENTS", help_message)
        end
        # Without arguments, or only with the `--project` option, the wrapper will fail
        @test !success(`$(mpiexecjl) --project=$(dir)`)
        @test !success(`$(mpiexecjl)`)
    end
end
6 changes: 6 additions & 0 deletions test/runtests.jl
@@ -6,6 +6,12 @@ if get(ENV,"JULIA_MPI_TEST_ARRAYTYPE","") == "CuArray"
using CuArrays
end

if Sys.isunix()
    # This test doesn't need to be run with mpiexec. `mpiexecjl` is currently
    # available only on Unix systems
    include("mpiexecjl.jl")
end

args = Base.shell_split(get(ENV, "JULIA_MPIEXEC_TEST_ARGS", ""))

function runtests()
