GridapDistributedPETScWrappers

GridapDistributed.jl wrappers for the PETSc library. 🚧 work in progress 🚧

Purpose

This package is currently experimental and under development. In any case, we warn that it is not meant to be a fully-functional PETSc wrapper written in Julia (for that purpose we refer to the PETSc.jl package, which is currently being revamped). Instead, it provides the functionality from PETSc that GridapDistributed.jl requires (sufficient, although not necessarily minimal). Given that the latter is still under development, GridapDistributedPETScWrappers.jl may also vary according to the changing requirements of GridapDistributed.jl. Once we have a clearer and more definite understanding of what GridapDistributed.jl requires from PETSc, we may eventually cut down the current code of GridapDistributedPETScWrappers.jl significantly.

The development of this package originally started from JuliaParallel's PETSc.jl (in particular, from commit https://github.com/JuliaParallel/PETSc.jl/commit/3d8c46a127821aa1ff20d5892f50ec75be11c77f, uptodate branch). More information can be found in the following issue: gridap/GridapDistributed.jl#22. Not all code currently in GridapDistributedPETScWrappers.jl is functional. In principle, one can safely use all the machinery exercised in test/runtests.jl, although other parts may be functional as well.

Requirements, limitations, warnings

  • PETSc version >= v3.10.3 REQUIRED.

    From this commit of PETSc (petsc/petsc@2ebc710#diff-d46e9870b0b2f6361c8563135bfdaa89eab41a56290d02afb6ca42f5463ea629), the value of PETSC_INT changed from 0 to 16. This has implications for the PETSc Julia wrappers, which have to define the associated constant accordingly (see the first sketch after this list). According to PETSc release dates, this change is reflected from v3.10.3 onwards.

  • We currently only support PETSc compiled with PetscScalar==double and PetscReal==double (i.e., Julia's Float64). This is referred to as RealDouble within GridapDistributedPETScWrappers.jl. The version of PETSc.jl from which we started also supported RealSingle and ComplexDouble, although no effort has been made to support these again. On the other hand, both 32-bit and 64-bit integer builds of PETSc are allowed. The package automatically detects the size of PetscInt during cache module pre-compilation.

  • All finalizers of Julia types wrapping PETSc ones are deactivated. Thus, the underlying PETSc objects are not destroyed when their Julia wrappers are GC'ed. The user may explicitly destroy them by calling PetscDestroy (see the second sketch after this list). The user may activate finalizers by setting the package-wide constant deactivate_finalizers to false, although this is not recommended for two reasons, which, to be honest, I do not fully understand:

    1. Tests fail when finalizers are activated, because they cause an MPI call to be triggered after MPI_Finalize (I could not understand why this is the case).
    2. Quoting from a PETSc.jl dev doc file: "We can't attach finalizers for distributed objects (i.e. VecMPI), as destroy needs to be called collectively on all MPI ranks." I guess that the GC may not ensure the same order of execution for all MPI tasks, causing deadlocks and other sorts of issues.
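
Regarding the PETSC_INT remark above, the following is a minimal, hypothetical sketch of the kind of constant the wrapper has to keep in sync with PETSc's C-side PetscDataType enum; the actual name and location of this constant inside GridapDistributedPETScWrappers.jl may differ.

    # Hypothetical sketch: the Julia wrapper must mirror the C-side enum value.
    # For PETSc >= v3.10.3 (i.e., after petsc/petsc@2ebc710) PETSC_INT == 16;
    # in earlier releases it was 0.
    const PETSC_INT = Cint(16)

Regarding finalizers, a minimal sketch of the intended usage pattern follows. The call that creates the PETSc-backed object is a placeholder (it is not part of this package's API); only the explicit PetscDestroy call is the point being illustrated.

    using GridapDistributedPETScWrappers

    # `make_petsc_object()` is a placeholder for whatever call in your code creates
    # a PETSc-backed object (vector, matrix, solver, ...); it is NOT part of this
    # package.
    obj = make_petsc_object()

    # ... work with `obj` ...

    # Finalizers are deactivated by default, so the underlying PETSc object is NOT
    # destroyed when `obj` is GC'ed. Destroy it explicitly, collectively on all MPI
    # ranks, and before MPI_Finalize is reached.
    PetscDestroy(obj)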

Installation, usage instructions

GridapDistributedPETScWrappers.jl uses, among others, the MPI.jl Julia package; see the configuration documentation of MPI.jl, available here.

There are essentially two possible ways to build GridapDistributedPETScWrappers.jl (i.e., pkg> build GridapDistributedPETScWrappers):

  1. One wants to use MPI+PETSc libraries pre-compiled in Julia registry packages (this is the typical case when using this package on a local computer). In this case one has to ensure that both JULIA_MPI_BINARY and JULIA_PETSC_RealDouble_BINARY are either unset or set to the empty string.

  2. One wants to use a PETSc library already installed on the system (typically the case on an HPC cluster). In this case one has to ensure that MPI.jl is built such that it uses the same MPI library that this installation of PETSc is compiled/linked against (see the MPI.jl instructions referred to above). The following environment variables are used to configure how GridapDistributedPETScWrappers.jl is built (see the sketch after this list):

    • JULIA_PETSC_RealDouble_BINARY has to be set to "system".
    • JULIA_PETSC_RealDouble_DIR has to be set to PETSc's DIR.
    • JULIA_PETSC_RealDouble_ARCH has to be set to PETSc's ARCH.
    • JULIA_PETSC_RealDouble_LIBNAME may optionally be set to the name of the dynamic library file of the system installation of PETSc (libpetsc is used by default).
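
For instance, a build following option 2 could look as in the minimal sketch below; the DIR/ARCH values are placeholders to be replaced by those of your own system installation of PETSc.

    # Minimal sketch (option 2, system-provided PETSc). The values below are
    # placeholders; replace them with those of your PETSc installation.
    ENV["JULIA_PETSC_RealDouble_BINARY"] = "system"
    ENV["JULIA_PETSC_RealDouble_DIR"]    = "/opt/petsc/3.10.3"     # PETSc's DIR (placeholder)
    ENV["JULIA_PETSC_RealDouble_ARCH"]   = "arch-linux2-c-opt"     # PETSc's ARCH (placeholder)
    # Only needed if the dynamic library file is not named libpetsc:
    # ENV["JULIA_PETSC_RealDouble_LIBNAME"] = "libpetsc_real"

    using Pkg
    Pkg.build("GridapDistributedPETScWrappers")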
