24 changes: 23 additions & 1 deletion docs/services/jupyterlab.md
@@ -45,7 +45,7 @@ A kernel can be created from an active Python virtual environment with the follo
python -m ipykernel install --user --name="<kernel-name>" --display-name="<kernel-name>"
```

## Using uenvs in JupyterLab
## Using uenvs in JupyterLab for Python

In the JupyterHub Spawner Options form mentioned above, it's possible to pass an uenv and a view.
The uenv will be mounted at `/user-environment`, and the specified view will be activated.
@@ -65,6 +65,28 @@ Then with that virtual environment activated, you can run the command to create
If the uenv is not present in the local repository, it will be automatically fetched.
As a result, JupyterLab may take slightly longer than usual to start.


## Using Julia in JupyterHub

Each time you start a JupyterHub server, do the following in the JupyterHub Spawner Options form mentioned above:
!!! important "Pass a [`julia`][ref-uenv-julia] uenv and select the `jupyter` view."

When using Julia within Jupyter for the first time, IJulia and one or more Julia kernels need to be installed.
Type the following command in a shell within JupyterHub to install IJulia, the default Julia kernel and, on systems with Nvidia GPUs, a Julia kernel running under Nvidia Nsight Systems:
```console
install_ijulia
```
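
To check which kernels were installed, you can list them afterwards (a sketch, assuming the standard `jupyter` command is available in your JupyterHub shell):
```console
jupyter kernelspec list
```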

You can install additional custom Julia kernels by typing the following in a shell:
```console
julia
using IJulia
installkernel(<args>) # type `? installkernel` to learn about valid `<args>`
```
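
For example, the following would install an additional kernel that starts Julia with 8 threads (the kernel name is an arbitrary illustration; the `env` keyword is part of `IJulia.installkernel`):
```console
julia
using IJulia
installkernel("Julia (8 threads)", env=Dict("JULIA_NUM_THREADS" => "8"))
```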

!!! warning "First time use of Julia"
If you are using Julia for the very first time, executing `install_ijulia` will first automatically trigger the installation of `juliaup` and the latest `julia` version (this installation is also triggered if you execute `juliaup` or `julia`).

## Ending your interactive session and logging out

The Jupyter servers can be shut down through the Hub. To end a JupyterLab session, please select `Control Panel` under the `File` menu and then `Stop My Server`. By contrast, clicking `Logout` will log you out of the server, but the server will continue to run until the Slurm job reaches its maximum wall time.
4 changes: 4 additions & 0 deletions docs/software/prgenv/index.md
@@ -18,6 +18,10 @@ CSCS provides "programming environments" on Alps vClusters that provide compiler

Provides compilers, MPI and Python, along with linear algebra and mesh partitioning libraries for a broad range of use cases.

- :fontawesome-solid-layer-group: [__julia__][ref-uenv-julia]

Provides a complete HPC setup for running Julia efficiently at scale, using the supercomputer hardware optimally.

- :fontawesome-solid-layer-group: [__Cray Programming Environment__][ref-cpe]

The Cray Programming Environment (CPE) is a suite of compilers, libraries and tools provided by HPE.
93 changes: 93 additions & 0 deletions docs/software/prgenv/julia.md
@@ -0,0 +1,93 @@
[](){#ref-uenv-julia}
# julia

The `julia` uenv provides a complete HPC setup for running Julia efficiently at scale, using the supercomputer hardware optimally.
Unlike traditional approaches, this Julia HPC setup enables you to update Julia yourself using the included, preconfigured community tool [`juliaup`](https://github.com/JuliaLang/juliaup).
It also does not preinstall any packages site-wide. Instead, for key HPC packages that benefit from locally built libraries (`MPI.jl`, `CUDA.jl`, `AMDGPU.jl`, `HDF5.jl`, `ADIOS2.jl`, etc.), this uenv provides the libraries and presets package preferences and environment variables so that these packages are automatically installed and used optimally with the local libraries.
As a result, you only need to type, e.g., `] add CUDA` in the Julia REPL in order to install `CUDA.jl` optimally.
The `julia` uenv internally relies on the community scripting project [JUHPC](https://github.com/JuliaParallel/JUHPC) to achieve this.
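
As an illustrative sketch (assuming a node with Nvidia GPUs and the uenv's `juliaup` view active), installing and checking `CUDA.jl` looks like this:
```console
julia                  # start the Julia REPL
] add CUDA             # install CUDA.jl (press backspace to leave pkg mode)
using CUDA
CUDA.versioninfo()     # prints the CUDA setup that CUDA.jl will use
```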

## Versioning

The naming scheme is `julia/<version>`, where `<version>` has the `YY.M[M]` format: for example, September 2024 is `24.9` and May 2025 is `25.5`.
The release schedule is not fixed; new versions will be released when there is a compelling reason to update.

| version | node types | system |
|-----------|-----------|--------|
| 24.9 | gh200, zen2 | daint, eiger, todi |
| 25.5 | gh200, zen2 | daint, eiger, santis, clariden, bristen |

=== "25.5"

The key updates in version `25.5:v1` compared to version `24.9` were:

* enabling compatibility with the latest `uenv` version `8.0`
* changing the installation directory
* adding the `jupyter` view
* upgrading to `cuda@12.8` and `cray-mpich@8.1.30`

!!! info "HPC key libraries included"
* cray-mpich/8.1.30
* cuda/12.8.0
* hdf5/1.14.5
* adios2/2.10.2

## How to use

Find and pull a Julia uenv image, e.g.:
```bash
uenv image find julia # list available julia images
uenv image pull julia/25.5 # copy version[:tag] from the list above
```

Start the image and activate the Julia[up] HPC setup by loading the following view(s):
=== "`juliaup`"
!!! example ""
```bash
uenv start julia/25.5:v1 --view=juliaup
```

=== "`juliaup` and `modules`"
!!! example "This also activates modules for the available libraries, e.g. `cuda`."
```bash
uenv start julia/25.5:v1 --view=juliaup,modules
```

There is also a view `jupyter` available, which is required for [using Julia in JupyterHub][using-julia-in-jupyterhub].

!!! info "Automatic installation of Juliaup and Julia"
The installation of `juliaup` and the latest `julia` version happens automatically the first time `juliaup` is called:
```bash
juliaup
```
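
Once installed, `juliaup` can be used to manage Julia versions yourself, for example (standard `juliaup` subcommands, not specific to this uenv):
```bash
juliaup status        # list installed Julia versions and the current default
juliaup add lts       # install the current long-term-support release
juliaup default lts   # make it the default `julia`
```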

Note that the `julia` uenv is built by extending the `prgenv-gnu` uenv.
As a result, it also provides all the features of `prgenv-gnu`.
Please see [the `prgenv-gnu` documentation][ref-uenv-prgenv-gnu-how-to-use] for details.
You can, for example, load the `modules` view to see the exact versions of the libraries available in the uenv.
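
A possible sketch (the `uenv start` command is the one shown above; `module avail` is the standard Lmod command for listing available modules):
```bash
uenv start julia/25.5:v1 --view=juliaup,modules
module avail
```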

## Background on Julia for HPC

[Julia](https://julialang.org/) is a programming language that was designed to solve the "two-language problem", the problem that prototypes written in an interactive high-level language like MATLAB, R or Python need to be partly or fully rewritten in lower-level languages like C, C++ or Fortran when a high-performance production code is required.
Julia, which has its origins at MIT, can however reach the performance of C, C++ or Fortran despite being high-level and interactive.
This is possible thanks to Julia's just-ahead-of-time compilation: code can be executed in an interactive shell as usual for prototyping languages, but functions and code blocks are compiled to machine code right before their first execution instead of being interpreted (note that modules are pre-compiled).
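
A small illustration of this behaviour in the Julia REPL (timings are indicative only):
```julia
f(x) = 2 .* x .+ 1     # a simple function; compiled to machine code on first call
@time f(rand(10^6));   # first call: the reported time includes compilation
@time f(rand(10^6));   # second call: runs the already compiled code, much faster
```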

Julia is optimally suited for parallel computing, supporting, e.g., MPI (via [`MPI.jl`](https://github.com/JuliaParallel/MPI.jl)) and threads similar to OpenMP.
Moreover, Julia's GPU packages ([`CUDA.jl`](https://github.com/JuliaGPU/CUDA.jl), [`AMDGPU.jl`](https://github.com/JuliaGPU/AMDGPU.jl), etc.) enable writing native Julia code for GPUs [1], which can reach efficiency similar to CUDA C/C++ [2] or the equivalents of other vendors.
Julia was shown to be suitable for scientific GPU supercomputing at large scale, enabling near optimal performance and nearly ideal scaling on thousands of GPUs on Piz Daint [2,3,4,5].
Packages like [ParallelStencil.jl](https://github.com/omlins/ParallelStencil.jl) [[4](https://doi.org/10.21105/jcon.00138)] and [ImplicitGlobalGrid.jl](https://github.com/eth-cscs/ImplicitGlobalGrid.jl) [[3](https://doi.org/10.21105/jcon.00137)] make it possible to unify prototype and high-performance production code in a single codebase.
Furthermore, Julia permits direct calling of C/C++ and Fortran libraries without glue code.
It also features interfaces to prototyping languages such as Python, R and MATLAB.
Finally, the [Julia PackageCompiler](https://github.com/JuliaLang/PackageCompiler.jl) makes it possible to compile Julia modules into shared libraries that are callable from C or other languages (a comprehensive [Proof of Concept](https://github.com/omlins/libdiffusion) was already available in 2018, and the PackageCompiler has matured considerably since).
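
As a minimal sketch of the `MPI.jl` interface mentioned above (a standard hello-world example, not specific to this uenv):
```julia
# hello_mpi.jl -- run e.g. with: srun -n 4 julia hello_mpi.jl
using MPI

MPI.Init()
comm   = MPI.COMM_WORLD
rank   = MPI.Comm_rank(comm)
nranks = MPI.Comm_size(comm)
println("Hello from rank $rank of $nranks")
MPI.Finalize()
```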

## References

[1] Besard, T., Foket, C., & De Sutter, B. (2018). Effective Extensible Programming: Unleashing Julia on GPUs. IEEE Transactions on Parallel and Distributed Systems, 30(4), 827-841.

[2] Räss, L., Omlin, S., & Podladchikov, Y. Y. (2019). Porting a Massively Parallel Multi-GPU Application to Julia: a 3-D Nonlinear Multi-Physics Flow Solver. JuliaCon Conference, Baltimore, US.

[3] Omlin, S., Räss, L., & Utkin, I. (2024). Distributed Parallelization of xPU Stencil Computations in Julia. The Proceedings of the JuliaCon Conferences, 6(65), 137, https://doi.org/10.21105/jcon.00137

[4] Omlin, S., & Räss, L. (2024). High-performance xPU Stencil Computations in Julia. The Proceedings of the JuliaCon Conferences, 6(64), 138, https://doi.org/10.21105/jcon.00138

[5] Omlin, S., Räss, L., Kwasniewski, G., Malvoisin, B., & Podladchikov, Y. Y. (2020). Solving Nonlinear Multi-Physics on GPU Supercomputers with Julia. JuliaCon Conference, virtual.
1 change: 1 addition & 0 deletions mkdocs.yml
@@ -58,6 +58,7 @@ nav:
- 'prgenv-gnu': software/prgenv/prgenv-gnu.md
- 'prgenv-nvfortran': software/prgenv/prgenv-nvfortran.md
- 'linalg': software/prgenv/linalg.md
- 'julia': software/prgenv/julia.md
- 'Cray modules (CPE)': software/prgenv/cpe.md
- 'Machine Learning':
- software/ml/index.md