diff --git a/.gitignore b/.gitignore index 632ce84ad8..719ebdbb1b 100644 --- a/.gitignore +++ b/.gitignore @@ -152,3 +152,6 @@ Temporary Items # jupyter .ipynb_checkpoints + +# VSCode +.vscode diff --git a/CHANGELOG.md b/CHANGELOG.md new file mode 100644 index 0000000000..869e6cbdb3 --- /dev/null +++ b/CHANGELOG.md @@ -0,0 +1,230 @@ +# Changelog + +:arrow_left: [Back to main page](./README.md) + +- October 7, 2022: Version 5 comes with many changes, both visible and + invisible ones. Some of those break the existing API, but if you are + using tests, you should be fine. Major changes include: + + - **New convergence controllers**: Checking whether a step has + converged can be tricky, so we made separate modules out of + these checks. This makes features like adaptivity easier to + implement. Also, the controllers have been streamlined a bit to + make them more readable/digestible. Thanks to + [\@brownbaerchen](https://github.com/brownbaerchen)! + - **Adaptivity and error estimators**: SDC now comes with + adaptivity and error estimation, leveraging the new convergence + controllers out of the box. Thanks to + [\@brownbaerchen](https://github.com/brownbaerchen)! + - **New collocation classes**: We completely rewrote the way + collocation nodes and weights are computed. It is now faster, + more reliable, shorter, better. But: this **breaks the API**, + since the old collocation classes do not exist anymore. The + projects, tutorials, tests and most of the playgrounds have been + adapted, so have a look over there to see [what to + change](https://github.com/Parallel-in-Time/pySDC/commit/01ffabf71a8d71d33b74809271e8ad5a7b03ac5e#diff-adf74297b6c64d320f4da0f1d5528eda6229803a6615baf5d54c418032543681). + Thanks to [\@tlunet](https://github.com/tlunet)! + - **New projects**: Resilience and energy grid simulations are + ready to play with and are waiting for more ideas! We used this + effort to condense and clean up the problem classes a bit, + significantly reducing the number of files and classes with only + marginal differences. This could potentially **break your + code**, too, if you rely on any of those affected ones. Thanks + to [\@brownbaerchen](https://github.com/brownbaerchen) and + [\@lisawim](https://github.com/lisawim)! + - **Toward GPU computing**: We included a new data type based on + [CuPy](https://cupy.dev/), making GPU computing possible. Thanks + to [\@timo2705](https://github.com/timo2705)! + - **Better testing**: The CI pipeline got a complete overhaul + (again), now enabling simultaneous tests, faster/earlier + linting, benchmarking (at least, in principle), separate + environments and so on. The code is tested under Ubuntu and + MacOS. + - **Better code formatting**: `pySDC` now uses + [black](https://black.readthedocs.io) and + [flakeheaven](https://flakeheaven.readthedocs.io) for cleaner + source code. After complaints here and there about linting + \"errors\", the recommended way now is to run `black` before + submission. + +- December 13, 2021: Version 4.2 brings compatibility with Python 3.9, + including some code cleanup. The CI test suite seems to run faster + now, since sorting out the dependencies is faster. Tested + [mamba](https://github.com/mamba-org/mamba), which for now makes the + CI pipeline much faster. Also, the CI workflow can now run locally + using [act](https://github.com/nektos/act). We introduced markers + for some of the tests in preparation for distributed tests on + different platforms.
And finally, a LaTeX installation is no longer + needed for plotting (but recommended). + +- August 11, 2021: Version 4.1 has some more changes under the hood, + most of them with no significant impact on users. The CI pipeline + has been completely rewritten, porting the code to [GitHub + Actions](https://github.com/features/actions) (away from [Travis + CI](https://travis-ci.com/)), to [flake8](https://flake8.pycqa.org) + and to [pytest](https://pytest.org) (away from + [nose](https://nose.readthedocs.io)). One thing that may have an + impact on users is that following the changes made in Version 4.0, + the PETSc data structures are now much simpler, removing a lot of + unnecessary boilerplate code. + +- May 4, 2021: Long time, no see, but this major release 4.0 marks + some improvements under the hood: + + - **Rewrote `mesh` and `particle` data type**: + Creation of new arrays for each operation is now avoided by + directly subclassing Numpy\'s `ndarray`. Somewhat faster, + definitely better, less code, future-proof, but also breaking + the API. If you use `pySDC` for your project, make + sure you adapt to the new data types (or don\'t upgrade). + - **Faster quadrature**: Thanks to + [tlunet](https://github.com/tlunet) the computation of the + weights is now faster and (even) more reliable. No breaking of + any API here\... + - **Bugfixing and version pushes**: The code should run without + (many) complaints with Python 3.6, 3.7 and potentially above. + Also, the plotting routines have been adapted to work with + recent versions of `matplotlib`. + + This is not much (yet) and if it were not for the API changes, this + would have been a minor release. + +- August 30, 2019: Version 3.1 adds many more examples like the + nonlinear Schrödinger equation, more on Gray-Scott and in particular + Allen-Cahn. Many of those are implemented using the parallel FFT + library + [mpi4py-fft](https://bitbucket.org/mpi4py/mpi4py-fft/src/master/), + which can now be used with `pySDC`. There are now 8 + tutorials, where step 7 shows the usage of three external libraries + with `pySDC`: mpi4py, FEniCS and petsc4py. The MPI controller has + been improved after performing a detailed performance analysis + using [Score-P](https://www.vi-hps.org/projects/score-p/) and + [Extrae](https://www.vi-hps.org/Tools/Extrae.html). Finally: first + steps towards error/iteration estimators have been taken, too. + +- February 14, 2019: Released version 3 of `pySDC`. This + release is accompanied by the **ACM TOMS paper** + ["pySDC -- Prototyping spectral deferred corrections"](https://doi.org/10.1145/3310410). + This release contains some breaking changes to the API. In detail: + + - **Dropped Python 2 support**: Starting with this version, + `pySDC` relies on Python 3. Various incompatibilities + led to inconsistent treatment of dependencies, so that parts of + the code had to use Python 2 while others relied on Python 3 or + could do both. We follow [A pledge to migrate to Python + 3](https://python3statement.org/) with this decision, as most + prominent dependencies of `pySDC` already do. + - **Unified controllers**: Instead of providing (and maintaining) + four different controllers, this release only has one for + emulated and one for MPI-based time-parallelization + (`controller_nonMPI` and `controller_MPI`). This should avoid + further confusion and make the code easier to maintain.
Both + controllers use the multigrid perspective for the algorithm + (first exchange data, then compute updates), but the classical + way of determining when to stop locally (each time-step is + stopped when ready, if the previous one is ready, too). The + complete multigrid behavior can be restored using a flag. All + included projects and tutorials have been adapted to this. + - **No more data types in the front-ends**: The redundant use of + data type specifications in the description dictionaries has + been removed. Data types are now declared within each problem + class (more precisely, in the header of the `__init__`-method to + allow inheritance). All included projects and tutorials have + been adapted to this. + - **Renewed FEniCS support**: This release revives the deprecated + [FEniCS](https://fenicsproject.org/) support, now requiring at + least FEniCS 2018.1. The integration is tested using Travis-CI. + - **More consistent handling of local initial conditions**: The + treatment of `u[0]` and `f[0]` has been fixed and made + consistent throughout the code. + - As usual, many bugs have been discovered and fixed. + +- May 23, 2018: Version 2.4 adds support for + [petsc4py](https://bitbucket.org/petsc/petsc4py)! You can now use + [PETSc](http://www.mcs.anl.gov/petsc/) data types + (`pySDC` ships with DMDA for distributed structured + grids) and parallel solvers right from your examples and problem + classes. There is also a new tutorial (7.C) showing this in a bit + more detail, including communicator splitting for parallelization in + space and time. Warning: in order to get this to work you need to + install petsc4py and mpi4py first! Make sure both use MPICH3 + bindings. Downloading `pySDC` from PyPI does not include + these packages. + +- February 8, 2018: Ever got annoyed at `pySDC`\'s + incredibly slow setup phase when multiple time-steps are used? + Version 2.3 changes this by copying the data structure of the first + step to all other steps using the [dill + package](https://pypi.python.org/pypi/dill). Setup times could be + reduced by 90% and more for certain problems. We also increased the + speed for certain calculations, in particular for the Penning trap + example. + +- November 7, 2017: Version 2.2 contains matrix-based versions of + PFASST within the project `matrixPFASST`. This involved quite a few + changes in more or less unexpected places, e.g. in the multigrid + controller and the transfer base class. The impact of these changes + on other projects should be negligible, though. + +- October 25, 2017: For the [6th Workshop on Parallel-in-Time + Integration](https://www.ics.usi.ch/index.php/6th-workshop-on-parallel-in-time-methods) + `pySDC` has been updated to version 2.1. It is now + available on PyPI - the Python Package Index, see + <https://pypi.python.org/pypi/pySDC>, and can be installed simply by + using `pip install pySDC`. Naturally, this release contains a lot of + bugfixes and minor improvements. Most notably, the file structure + has been changed again to meet the standards for Python packaging + (at least a bit). + +- November 24, 2016: Released version 2 of `pySDC`. This + release contains major changes to the code and its structure: + + - **Complete redesign of code structure**: The `core` part of + `pySDC` only contains the core modules and classes, + while `implementations` contains the actual implementations + necessary to run something. This now includes separate files for + all collocation classes, as well as a collection of problems, + transfer classes and so on.
Most examples have been ported to + either `tutorials`, `playgrounds` or `projects`. + - **Introduction of tutorials**: We added a tutorial (see below) + to explain many of pySDC\'s features in a step-by-step fashion. + We start with a simple spatial discretization and collocation + formulations and move step by step to SDC, MLSDC and PFASST. All + tutorials are accompanied by tests. + - **New all-inclusive controllers**: Instead of having two PFASST + controllers which could also do SDC and MLSDC (and more), we now + have four generic controllers which can do all these methods, + depending on the input. They are split two-by-two into classes: + `MPI` and `NonMPI` for real or virtual + parallelism as well as `classic` and + `multigrid` for the standard and multigrid-like + implementation of PFASST and the likes. Initialization has been + simplified a lot, too. + - **Collocation-based coarsening**: Like the standard PFASST + libraries [libpfasst](https://bitbucket.org/memmett/libpfasst) + and [PFASST++](https://github.com/Parallel-in-Time/PFASST), + `pySDC` now offers collocation-based coarsening, + i.e. the number of collocation nodes can be reduced during + coarsening. Also, time-step coarsening is in preparation, but + not implemented yet. + - **Testing and documentation**: The core, implementations and + plugin packages and their subpackages are fully documented using + sphinx-apidoc, see below. This documentation as well as this + website are generated automatically using + [Travis-CI](https://travis-ci.org/Parallel-in-Time/pySDC). Most + of the code is supported by tests, mainly realized by using the + tutorials as the test routines with clearly defined results. + Also, projects are accompanied by tests. + - Further, minor changes: + - Switched to more stable barycentric interpolation for the + quadrature weights + - New collocation class: `EquidistantSpline_Right` + for spline-based quadrature + - Collocation tests are realized by generators and not by + classes + - Multi-step SDC (aka single-level PFASST) now works as + expected + - Reworked many of the internal structures for consistency and + simplicity + +:arrow_left: [Back to main page](./README.md) \ No newline at end of file diff --git a/CHANGELOG.rst b/CHANGELOG.rst deleted file mode 100644 index 56a97c88d6..0000000000 --- a/CHANGELOG.rst +++ /dev/null @@ -1,133 +0,0 @@ -Changelog --------- - -- October 7, 2022: Version 5 comes with many changes, both visible and invisible ones. Some of those break the existing API, but - if you are using tests, you should be fine. Major changes include: - - - **New convergence controllers**: Checking whether a step has converged can be tricky, so we made separate modules out of these - checks. This makes features like adaptivity easier to implement. Also, the controllers have been streamlined a bit to make them more readable/digestible. - Thanks to `@brownbaerchen `_! - - **Adaptivity and error estimators**: SDC now comes with adaptivity and error estimation, leveraging the new convergence controllers out of the box. - Thanks to `@brownbaerchen `_! - - **New collocation classes**: We completely rewrote the way collocation nodes and weights are computed. It is now faster, more reliable, shorter, better. - But: this **breaks the API**, since the old collocation classes do not exist anymore. The projects, tutorials, tests and most of the playgrounds - have been adapted, so have a look over there to see `what to change `_. - Thanks to `@tlunet `_!
- - **New projects**: Resilience and energy grid simulations are ready to play with and are waiting for more ideas! - We used this effort to condense and clean up the problem classes a bit, reducing the number of files and classes with only marginal differences significantly. - This could potentially **break your code**, too, if you rely on any of those affected ones. - Thanks to `@brownbaerchen `_ and `@lisawim `_! - - **Toward GPU computing**: We included a new data type based on `CuPy `_ making GPU computing possible. - Thanks to `@timo2705 `_! - - **Better testing**: The CI pipeline got a complete overhaul (again), now enabling simultaneous tests, faster/earlier linting, benchmarking (at least, in principal), separate environments and so on. - The code is tested under Ubuntu and MacOS. - - **Better code formatting**: `pySDC` now uses `black `_ and `flakeheaven `_ for cleaner source code. - After complaints here and there about linting "errors" the recommended way now is to run ``black`` before submission. - -- December 13, 2021: Version 4.2 brings compatibility with Python 3.9, including some code cleanup. The CI test - suite seems to run faster now, since sorting out the dependencies is faster. Tested `mamba `_, - which for now makes the CI pipeline much faster. Also, the CI workflow can now run locally using `act `_. - We introduced markers for soem of the tests in preparation of distributed tests on different platforms. And finally, a LaTeX - installation is no longer needed use plotting (but recommended). - -- August 11, 2021: Version 4.1 has some more changes under the hood, most of them with no significant impact to users. - The CI pipeline has been completely rewritten, porting the code to `Github Actions `_ - (away from `Travis CI `_), to `flake8 `_ and to `pytest `_ - (away from `nose `_). One thing that may have an impact on users is that following the changes - made in Version 4.0, the PETSc data structures are now much easier, removing a lot of unnecessary boilerplate code. - -- May 4, 2021: Long time, no see, but this major release 4.0 marks some improvements under the hood: - - - **Rewrote ``mesh`` and ``particle`` data type**: Creation of new arrays for each operation is now avoided by - directly subclassing Numpy's ``ndarray``. Somewhat faster, definitively better, less code, future-proof, but also breaking the API. If you use `pySDC` - for your project, make sure you adapt to the new data types (or don't upgrade). - - **Faster quadrature**: Thanks to `tlunet `_ the computation of the weights is now faster and - (even) more reliable. No breaking of any API here... - - **Bugfixing and version pushes**: The code should run without (many) complaints with Python 3.6, 3.7 and potentially above. - Also, the plotting routines have been adapted to work with recent versions of `matplotlib`. - - This is not much (yet) and if it were not for the API changes, this would have been a minor release. - -- August 30, 2019: Version 3.1 adds many more examples like the nonlinear Schrödinger equation, more on Gray-Scott and in particular Allen-Cahn. - Those are many implemented using the parallel FFT library `mpi4pi-fft `_, which can now be used with `pySDC`. - There are now 8 tutorials, where step 7 shows the usage of three external libraries with `pySDC`: mpi4py, FEniCS and petsc4py. - The MPI controller has been improved after performaning a detailed performance analysis using `Score-P `_ and `Extrae `_. - Finally: first steps towards error/iteration estimators are taken, too. 
- -- February 14, 2019: Released version 3 of `pySDC`. This release is accompanied by the **ACM TOMS paper** - `"pySDC -- Prototyping spectral deferred corrections" `_. - It release contains some breaking changes to the API. In detail: - - - **Dropped Python 2 support**: Starting with this version, `pySDC` relies on Python 3. Various incompabilities led - to inconsistent treatment of dependencies, so that parts of the code had to use Python 2 while other relied on - Python 3 or could do both. We follow `A pledge to migrate to Python 3 `_ with this decision, - as most prominent dependencies of `pySDC` already do. - - **Unified controllers**: Instead of providing (and maintaining) four different controllers, this release only has - one for emulated and one for MPI-based time-parallelization (``controller_nonMPI`` and ``controller_MPI``). - This should avoid further confusion and makes the code easier to maintain. Both controllers use the multigrid - perspective for the algorithm (first exchange data, than compute updates), but the classical way of determining - when to stop locally (each time-step is stopped when ready, if the previous one is ready, too). The complete multigrid - behavior can be restored using a flag. All included projects and tutorials have been adapted to this. - - **No more data types in the front-ends**: The redundant use of data type specifications in the description dictionaries - has been removed. Data types are now declared within each problem class (more precisely, in the header of the - ``__init__``-method to allow inhertiance). All included projects and tutorials have been adapted to this. - - **Renewed FEniCS support**: This release revives the deprecated `FEniCS `_ support, now requiring at least FEniCS 2018.1. - The integration is tested using Travis-CI. - - **More consistent handling of local initial conditions**: The treatment of ``u[0]`` and ``f[0]`` has been fixed and - made consistent throughout the code. - - As usual, many bugs have been discovered and fixed. - -- May 23, 3018: Version 2.4 adds support for `petsc4py `_! - You can now use `PETSc `_ data types (`pySDC` ships with DMDA for distributed structured grids) and parallel solvers right from your examples and problem classes. - There is also a new tutorial (7.C) showing this in a bit more detail, including communicator splitting for parallelization in space and time. - Warning: in order to get this to work you need to install petsc4py and mpi4py first! Make sure both use MPICH3 bindings. - Downloading `pySDC` from PyPI does not include these packages. - -- February 8, 2018: Ever got annoyed at `pySDC`'s incredibly slow setup phase when multiple time-steps are used? Version 2.3 - changes this by copying the data structure of the first step to all other steps using the `dill Package `_. - Setup times could be reduced by 90% and more for certain problems. We also increase the speed for certain calculations, - in particular for the Penning trap example. - -- November 7, 2017: Version 2.2 contains matrix-based versions of PFASST within the project ``matrixPFASST``. This involved quite a few - changes in more or less unexpected places, e.g. in the multigrid controller and the transfer base class. The impact - of these changes on other projects should be negligible, though. - -- October 25, 2017: For the `6th Workshop on Parallel-in-Time Integration `_ - `pySDC` has been updated to version 2.1. 
It is now available on PyPI - the Python Package Index, see `https://pypi.python.org/pypi/pySDC `_ - and can be installed simply by using ``pip install pySDC``. Naturally, this release contains a lot of bugfixes and minor improvements. - Most notably, the file structure has been changed again to meet the standards for Python packaging (at least a bit). - -- November 24, 2016: Released version 2 of `pySDC`. This release contains major changes to the code and its structure: - - - **Complete redesign of code structure**: The ``core`` part of `pySDC` only contains the core modules and classes, - while ``implementations`` contains the actual implementations necessary to run something. - This now includes separate files for all collocation classes, as well as a collection of problems, transfer classes and so on. - Most examples have been ported to either ``tutorials``, ``playgrounds`` or ``projects``. - - - **Introduction of tutorials**: We added a tutorial (see below) to explain many - of pySDC's features in a step-by-step fashion. We start with a simple spatial - discretization and collocation formulations and move step by step to SDC, MLSDC and PFASST. - All tutorials are accompanied by tests. - - - **New all-inclusive controllers**: Instead of having two PFASST controllers - which could also do SDC and MLSDC (and more), we now have four generic controllers - which can do all these methods, depending on the input. They are split into - two by two class: `MPI` and `NonMPI` for real or virtual parallelisim as well - as `classic` and `multigrid` for the standard and multigrid-like implementation - of PFASST and the likes. Initialization has been simplified a lot, too. - - - **Collocation-based coarsening** As the standard PFASST libraries `libpfasst `_ and `PFASST++ `_ - `pySDC` now offers collocation-based coarsening, i.e. the number of collocation nodes can be reduced during coarsening. - Also, time-step coarsening is in preparation, but not implemented yet. - - - **Testing and documentation** The core, implementations and plugin packages and their subpackages are fully documented using sphinx-apidoc, see below. - This documentation as well as this website are generated automatically using `Travis-CI `_. - Most of the code is supported by tests, mainly realized by using the tutorial as the test routines with clearly defined results. Also, projects are accompanied by tests. - - - Further, minor changes: - - - Switched to more stable barycentric interpolation for the quadrature weights - - New collocation class: `EquidistantSpline_Right` for spline-based quadrature - - Collocation tests are realized by generators and not by classes - - Multi-step SDC (aka single-level PFASST) now works as expected - - Reworked many of the internal structures for consistency and simplicity \ No newline at end of file diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index 6e2d801ca9..42afd279b1 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -1,5 +1,7 @@ # Contributor Covenant Code of Conduct +:arrow_left: [Back to main page](./README.md) + ## Our Pledge We as members, contributors, and leaders pledge to make participation in our @@ -126,3 +128,5 @@ enforcement ladder](https://github.com/mozilla/diversity). For answers to common questions about this code of conduct, see the FAQ at https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations. 
+ +:arrow_left: [Back to main page](./README.md) \ No newline at end of file diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000000..68a79a8b73 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,18 @@ +# How to contribute to pySDC + +Development of the `pySDC` code uses the classical approach of forks and pull requests. +You can look at the [extended GitHub documentation](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/getting-started/about-collaborative-development-models) for more details (skip this if you are already used to it). Furthermore, branches on `pySDC` follow a pre-defined structure. To contribute to any of them, please look at the [pull request recommendations](./docs/contrib/01_pull_requests.md). + +Additionally, a _few_ rules are set to enforce code readability, consistency and reliability. Some of them are automatically tested with each commit and summarized in the page on [continuous integration (CI)](./docs/contrib/02_continuous_integration.md). +Others are specific conventions chosen for the pySDC library, which may or may not follow Python standards; they are detailed in the [naming conventions](./docs/contrib/03_naming_conventions.md) page. + +Finally, while `pySDC` provides many base functionalities that implement classical flavors of SDC, it also allows problem-specific applications through Object-Oriented Programming (OOP) and the implementation of custom inherited classes. +This follows a specific OOP framework; see the page on [custom implementations](./docs/contrib/04_custom_implementations.md) for more details. + +1. [GitHub Forks and Pull Requests](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/getting-started/about-collaborative-development-models) +2. [Pull Requests Recommendations](./docs/contrib/01_pull_requests.md) +3. [Continuous Integration](./docs/contrib/02_continuous_integration.md) +4. [Naming Conventions](./docs/contrib/03_naming_conventions.md) +5. [Custom Implementations](./docs/contrib/04_custom_implementations.md) + +:arrow_left: [Back to main page](./README.md) \ No newline at end of file diff --git a/README.md b/README.md new file mode 100644 index 0000000000..e7fa2ef81f --- /dev/null +++ b/README.md @@ -0,0 +1,116 @@ +[![badge-ga](https://github.com/Parallel-in-Time/pySDC/actions/workflows/ci_pipeline.yml/badge.svg?branch=master)](https://github.com/Parallel-in-Time/pySDC/actions/workflows/ci_pipeline.yml) +[![badge-ossf](https://bestpractices.coreinfrastructure.org/projects/6909/badge)](https://bestpractices.coreinfrastructure.org/projects/6909) +[![badge-cc](https://codecov.io/gh/Parallel-in-Time/pySDC/branch/master/graph/badge.svg?token=hpP18dmtgS)](https://codecov.io/gh/Parallel-in-Time/pySDC) +[![zenodo](https://zenodo.org/badge/26165004.svg)](https://zenodo.org/badge/latestdoi/26165004) + +# Welcome to pySDC! + +The `pySDC` project is a Python implementation of the +spectral deferred correction (SDC) approach and its flavors, esp. the +multilevel extension MLSDC and PFASST. It is intended for rapid +prototyping and educational purposes. New ideas like e.g. sweepers or +predictors can be tested and first toy problems can be easily +implemented.
+ +## Features + +- Variants of SDC: explicit, implicit, IMEX, multi-implicit, Verlet, + multi-level, diagonal, multi-step +- Variants of PFASST: virtual parallel or MPI-based parallel, + classical or multigrid perspective +- 8 tutorials: from setting up a first collocation problem to SDC, + PFASST and advanced topics +- Projects: many documented projects with defined and tested outcomes +- Many different examples, collocation types, data types already + implemented +- Works with [FEniCS](https://fenicsproject.org/), + [mpi4py-fft](https://mpi4py-fft.readthedocs.io/en/latest/) and + [PETSc](http://www.mcs.anl.gov/petsc/) (through + [petsc4py](https://bitbucket.org/petsc/petsc4py)) +- Continuous integration via [GitHub + Actions](https://github.com/Parallel-in-Time/pySDC/actions) and + [Gitlab CI](https://gitlab.hzdr.de/r.speck/pysdc/-/pipelines) +- Fully compatible with Python 3.7 - 3.10, runs at least on Ubuntu and + MacOS + +## Getting started + +The code is hosted on GitHub, see +<https://github.com/Parallel-in-Time/pySDC>, and PyPI, see +<https://pypi.python.org/pypi/pySDC>. While using `pip install pySDC` +will give you a core version of `pySDC` to work with, +working with the developer version is most often the better choice. We +thus recommend checking out the code from GitHub and installing the +dependencies e.g. by using a [conda](https://conda.io/en/latest/) +environment. For this, `pySDC` ships with environment files +which can be found in the folder `etc/`. Use these as e.g. + +``` bash +conda env create --yes -f etc/environment-base.yml +``` + +To check your installation, run + +``` bash +pytest pySDC/tests -m NAME +``` + +where `NAME` corresponds to the environment you chose (`base` in the +example above). You may need to update your `PYTHONPATH` by running + +``` bash +export PYTHONPATH=$PYTHONPATH:/path/to/pySDC/root/folder +``` + +in particular if you want to run any of the playgrounds, projects or +tutorials. All `import` statements there assume that +`pySDC`\'s base directory is part of `PYTHONPATH`. + +For many examples, `LaTeX` is used for the plots, i.e. a +decent installation of this is needed in order to run those examples. +When using `fenics` or `petsc4py`, a C++ +compiler is required (although installation may go through at first). + +For more details on `pySDC`, check out http://www.parallel-in-time.org/pySDC. + +## How to cite + +If you use pySDC or parts of it for your work, great! Let us know if we +can help you with this. Also, we would greatly appreciate a citation of +[this paper](https://doi.org/10.1145/3310410): + +> Robert Speck, **Algorithm 997: pySDC - Prototyping Spectral Deferred +> Corrections**, ACM Transactions on Mathematical Software (TOMS), +> Volume 45 Issue 3, August 2019, <https://doi.org/10.1145/3310410> + +The current software release can be cited using Zenodo: +[![zenodo](https://zenodo.org/badge/26165004.svg)](https://zenodo.org/badge/latestdoi/26165004) + +## Contributing + +`pySDC` code was originally developed by [Robert Speck (@pancetta)](https://github.com/pancetta), +and is now maintained and developed by a small community of scientists interested in SDC methods. +Check out the [Changelog](./CHANGELOG.md) to see pySDC's evolution since 2016. + +Any contribution is dearly welcome! If you want to take part in this, please take the time to read our [Contribution Guidelines](./CONTRIBUTING.md) +(and don't forget to take a peek at our nice [Code of Conduct](./CODE_OF_CONDUCT.md) :wink:).
+ + +## Acknowledgements + +This project has received funding from the [European High-Performance +Computing Joint Undertaking](https://eurohpc-ju.europa.eu/) (JU) under +grant agreement No 955701 ([TIME-X](https://www.time-x-eurohpc.eu/)). +The JU receives support from the European Union's Horizon 2020 research +and innovation programme and Belgium, France, Germany, and Switzerland. +This project also received funding from the [German Federal Ministry of +Education and Research](https://www.bmbf.de/bmbf/en/home/home_node.html) +(BMBF) grant 16HPC047. The project also received help from the +[Helmholtz Platform for Research Software Engineering - Preparatory +Study (HiRSE_PS)](https://www.helmholtz-hirse.de/). + +


\ No newline at end of file diff --git a/README.rst b/README.rst deleted file mode 100644 index e420177c73..0000000000 --- a/README.rst +++ /dev/null @@ -1,89 +0,0 @@ -|badge-ga| -|badge-ossf| -|badge-cc| -|zenodo| - -Welcome to pySDC! -================= - -The `pySDC` project is a Python implementation of the spectral deferred correction (SDC) approach and its flavors, -esp. the multilevel extension MLSDC and PFASST. It is intended for rapid prototyping and educational purposes. -New ideas like e.g. sweepers or predictors can be tested and first toy problems can be easily implemented. - - -Features --------- - -- Variants of SDC: explicit, implicit, IMEX, multi-implicit, Verlet, multi-level, diagonal, multi-step -- Variants of PFASST: virtual parallel or MPI-based parallel, classical of multigrid perspective -- 8 tutorials: from setting up a first collocation problem to SDC, PFASST and advanced topics -- Projects: many documented projects with defined and tested outcomes -- Many different examples, collocation types, data types already implemented -- Works with `FEniCS `_, `mpi4py-fft `_ and `PETSc `_ (through `petsc4py `_) -- Continuous integration via `GitHub Actions `_ and `Gitlab CI `_ -- Fully compatible with Python 3.7 - 3.10, runs at least on Ubuntu and MacOS - - -Getting started ---------------- - -The code is hosted on GitHub, see `https://github.com/Parallel-in-Time/pySDC `_, and PyPI, see `https://pypi.python.org/pypi/pySDC `_. -While using ``pip install pySDC`` will give you a core version of `pySDC` to work with, working with the developer version -is most often the better choice. We thus recommend to checkout the code from GitHub and install the dependencies e.g. by using a `conda `_ environment. -For this, `pySDC` ships with environment files which can be found in the folder ``etc/``. Use these as e.g. - -.. code-block:: bash - - conda env create --yes -f etc/environment-base.yml - -To check your installation, run - -.. code-block:: bash - - pytest pySDC/tests -m NAME - -where ``NAME`` corresponds to the environment you chose (``base`` in the example above). -You may need to update your ``PYTHONPATH`` by running - -.. code-block:: bash - - export PYTHONPATH=$PYTHONPATH:/path/to/pySDC/root/folder - -in particular if you want to run any of the playgrounds, projects or tutorials. -All ``import`` statements there assume that the `pySDC`'s base directory is part of ``PYTHONPATH``. - -For many examples, `LaTeX` is used for the plots, i.e. a decent installation of this is needed in order to run those examples. -When using `fenics` or `petsc4py`, a C++ compiler is required (although installation may go through at first). - -For more details on `pySDC`, check out `http://www.parallel-in-time.org/pySDC `_. - - -How to cite ------------ - -If you use pySDC or parts of it for your work, great! Let us know if we can help you with this. Also, we would greatly appreciate a citation of `this paper `_: - - Robert Speck, **Algorithm 997: pySDC - Prototyping Spectral Deferred Corrections**, - ACM Transactions on Mathematical Software (TOMS), Volume 45 Issue 3, August 2019, - `https://doi.org/10.1145/3310410 `_ - -The current software release can be cited using Zenodo: |zenodo| - -.. |zenodo| image:: https://zenodo.org/badge/26165004.svg - :target: https://zenodo.org/badge/latestdoi/26165004 - -Acknowledgements ----------------- - -This project has received funding from the `European High-Performance Computing Joint Undertaking `_ (JU) under grant agreement No 955701 (`TIME-X `_). 
- The JU receives support from the European Union's Horizon 2020 research and innovation programme and Belgium, France, Germany, and Switzerland. -This project also received funding from the `German Federal Ministry of Education and Research `_ (BMBF) grant 16HPC047. -The project also received help from the `Helmholtz Platform for Research Software Engineering - Preparatory Study (HiRSE_PS) `_. - - -.. |badge-ga| image:: https://github.com/Parallel-in-Time/pySDC/actions/workflows/ci_pipeline.yml/badge.svg?branch=master - :target: https://github.com/Parallel-in-Time/pySDC/actions/workflows/ci_pipeline.yml -.. |badge-ossf| image:: https://bestpractices.coreinfrastructure.org/projects/6909/badge - :target: https://bestpractices.coreinfrastructure.org/projects/6909 -.. |badge-cc| image:: https://codecov.io/gh/Parallel-in-Time/pySDC/branch/master/graph/badge.svg?token=hpP18dmtgS - :target: https://codecov.io/gh/Parallel-in-Time/pySDC diff --git a/docs/contrib/01_pull_requests.md b/docs/contrib/01_pull_requests.md new file mode 100644 index 0000000000..9cb2dc21f6 --- /dev/null +++ b/docs/contrib/01_pull_requests.md @@ -0,0 +1,60 @@ +# Recommendations for pull requests + +Contributions to the `pySDC` code are expected to be made through pull requests from personal (public) forked repositories. A few core developers may occasionally push maintenance commits directly to the main repository. However (even for core developers), it is highly recommended to add specific contributions through dedicated pull requests from forks. + +## Contributing to the main branch + +The main version of `pySDC` is hosted in the `master` branch, on which any contributor can propose pull requests. Those can consist of: + +- bug fixes and code corrections (_e.g_ solving one of the current [issues](https://github.com/Parallel-in-Time/pySDC/issues)) +- addition or improvement of documentation +- improvement of existing functionalities (performance, accuracy, usage, ...) +- addition of new functionalities and applications +- improvement of CI test routines + +Pull requests should come from fork branches with a name specific to the contribution. For instance: + +``` +# branch name: +issue214 # to solve issue 214 +awesome_new_project # to add a new project +some_feature # to add a new feature (implementation, ...) +``` + +> :scroll: Favor _short names_ for branches, using _lower case_ and, if needed, underscores to ease readability. + +Those changes should be compatible with the existing API (_i.e_ not break it) and **avoid any change** in the current user interface. In particular, they should not modify default values for parameters or remove attributes of existing classes. But new attributes or parameters can be added with pre-set default values, and new classes can be added in the `pySDC.implementations` module. + +> :bell: During the revision of your pull request, it can happen that additional changes are made to the `upstream/master` branch (in the parallel-in-time/pySDC repo). In that case, don't hesitate to regularly merge them into your local branch to resolve potential conflicts, for instance: +> +> ```bash +> # On your local repo, with the "my_feature" branch +> $ git fetch upstream # synchronize with parallel-in-time/pySDC +> $ git merge upstream/master # merge into my_feature +> $ git push # push local merges to your repository +> ``` +> +> The pull request will be updated with any merge changes on the `my_feature` branch of your repository.
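+
+> :bell: The commands above assume that your local clone already knows the main repository under the remote name `upstream`. This is a convention used throughout these pages, not something that is set up automatically, so here is a minimal sketch of the one-time setup (the remote names and the `my_feature` branch name are just examples):
+>
+> ```bash
+> # Add the main pySDC repository as the "upstream" remote (your fork stays "origin")
+> $ git remote add upstream https://github.com/Parallel-in-Time/pySDC.git
+> $ git fetch upstream                           # retrieve its branches, e.g. upstream/master
+> $ git checkout -b my_feature upstream/master   # start the contribution branch from the latest master
+> $ git push -u origin my_feature                # publish the branch to your fork, then open the pull request
+> ```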
+ + +## Release development branches + +Branches with names starting with `v[...]` are development branches for the next releases of `pySDC` (_e.g_ `v5`, `v6`, ...). Those may introduce API-breaking changes (user interface, structure of core classes) that would force re-writing application scripts using `pySDC` (_e.g_ tutorials, projects, ...). Contributions to those branches are made by core developers, but anyone can also propose pull requests on those branches once the roadmap and milestones for the associated release have been written down in a dedicated issue. +Such branches are merged into `master` when ready. + +> :scroll: Pull requests to those branches can be made from fork branches using the **same name** as the release branch. + +> :bell: **Never** merge modifications on the `upstream/master` branch into your own local release branch. If some commits on the master branch have to be taken into account in the release branch (for instance, v6), then first request a merge of `upstream/master` into `upstream/v6`, merge `upstream/v6` into your local `v6` branch, then push into your own repository to update the pull request. + +## Feature development branches + +Additional branches starting with the prefix `dev/[...]` can be used to add new features that cannot be added with only one pull request (for instance, when several developers work on them). +Those are eventually merged into `master` if they don't break the API, or into the next release branch if they do. + +> :scroll: Pull requests to those branches can be made from fork branches using the **same name** as the feature branch. + +> :bell: **Never** merge modifications on `upstream/master` or any release branch into your own local development branch (same comment and solution as for the release branches above). + + +:arrow_left: [Back to Contributing Summary](./../../CONTRIBUTING.md) --- +:arrow_right: [Next to Continuous Integration](./02_continuous_integration.md) \ No newline at end of file diff --git a/docs/contrib/02_continuous_integration.md b/docs/contrib/02_continuous_integration.md new file mode 100644 index 0000000000..245621c23a --- /dev/null +++ b/docs/contrib/02_continuous_integration.md @@ -0,0 +1,75 @@ +# Continuous Integration in pySDC + +Every commit in `pySDC` is tested by GitHub continuous integration (CI). You can see the tests for each branch in the [action panel](https://github.com/Parallel-in-Time/pySDC/actions). +Those tests can be divided into two main categories: [code linting](#code-linting) and [code testing](#code-testing). +Finally, the CI also builds artifacts that are used to generate the documentation website (see http://parallel-in-time.org/pySDC/); more details are given in the [documentation generation](#documentation-generation) section. + +## Code linting + +Code style linting is performed using [black](https://black.readthedocs.io/en/stable/), and [flakeheaven](https://flakeheaven.readthedocs.io/en/latest/) is used for code syntax checking. In particular, `black` is used to check compliance with (most of) the [PEP-8 guidelines](https://peps.python.org/pep-0008/).
+ +Those tests are conducted for each commit (even on forks), but you can also run them locally in the root folder of `pySDC` before pushing any commit: + +```bash +# Install required packages (works also with conda/mamba) +pip install black flakeheaven flake8-comprehensions flake8-bugbear +# First: check code style with black +black pySDC --check --diff --color +# Second: check code syntax with flakeheaven +flakeheaven lint --benchmark pySDC +``` + +> :bell: To avoid formatting errors from `black`, you can simply use it to reformat your code directly with the command: +> +> ```bash +> black pySDC +> ``` + +Some style rules that are automatically enforced: + +- lines should be no longer than 120 characters +- arithmetic operators (`+`, `*`, ...) should be separated from variables by one space + +## Code testing + +This is done using [pytest](https://docs.pytest.org/en/7.2.x/), and runs all the tests written in the `pySDC/tests` folder. You can run those locally in the root folder of `pySDC` using: + +```bash +# Install required packages (works also with conda/mamba) +pip install "pytest<7.2.0" pytest-benchmark "coverage[toml]" +# Run tests +pytest -v pySDC/tests +``` + +> :bell: Many components are tested (core, implementations, projects, tutorials, etc.), which makes the testing quite long. +> When working on a single part of the code, you can run only the corresponding part of the tests by specifying the test path, for instance: +> +> ```bash +> pytest -v pySDC/tests/test_nodes.py # only test nodes generation +> ``` + +## Documentation generation + +Documentation is built using [sphinx](https://www.sphinx-doc.org/en/master/). +To check its generation, you can wait for all the CI tasks to finish, download the `docs` artifact, unzip it and open the `index.html` file there with your favorite browser. + +However, when you are working on documentation (of the project, of the code, etc.), you can already build and check the website locally: + +```bash +# Run all tests, continuing even with errors +pytest --continue-on-collection-errors -v --durations=0 pySDC/tests +# Generate rst files for sphinx +./docs/update_apidocs.sh +# Generate html documentation +sphinx-build -b html docs/source docs/build/html +``` + +Then you can open `docs/build/html/index.html` with your favorite browser and check how your documentation looks on the website. + +> :bell: **Important**: running all the tests is necessary to generate the graphs and images used by the website. +> But you can still generate the website without them: the images for the tutorials, projects and playgrounds will simply be missing. +> This approach can be considered for local testing of your contribution when it does not concern parts containing images (_i.e_ project or code documentation). + +:arrow_left: [Back to Pull Request Recommendations](./01_pull_requests.md) --- +:arrow_up: [Contributing Summary](./../../CONTRIBUTING.md) --- +:arrow_right: [Next to Naming Conventions](./03_naming_conventions.md) \ No newline at end of file diff --git a/docs/contrib/03_naming_conventions.md b/docs/contrib/03_naming_conventions.md new file mode 100644 index 0000000000..d2d06c4e2a --- /dev/null +++ b/docs/contrib/03_naming_conventions.md @@ -0,0 +1,144 @@ +# Naming conventions in pySDC + +> :scroll: These rules may not be enforced by the current implementation of pySDC. However, they should be enforced for any contribution.
+ +Naming conventions are mostly inspired by the [PEP-8 guidelines](https://peps.python.org/pep-0008/), even if some of them differ. Of course, strictly following those rules is not always the best solution, as Guido van Rossum's key insight states: + +> _A Foolish Consistency is the Hobgoblin of Little Minds_ + +The most important idea, in the end, is to find an optimal compromise between + +- readability: _Can someone else easily read and understand my code?_ +- effectiveness: _Does my code avoid kilometers-long lines to do simple things?_ + +Both aspects are interdependent: together they ease the maintenance and development of any code and improve its attractiveness to potential users. + +## First definitions + +Possible naming formats: + +- all-lowercase: `variablenamelikethis` +- snake_case: `variable_name_like_this` +- PascalCase: `VariableNameLikeThis` +- camelCase: `variableNameLikeThis` +- all-uppercase with underscore: `VARIABLE_NAME_LIKE_THIS` +- all-uppercase with minus: `VARIABLE-NAME-LIKE-THIS` + +## Package and module names + +Modules should have short, all-lowercase names. Underscores can be used in the module name if it improves readability (_i.e_ use snake_case only if it helps, +else try to stick to all-lowercase). +Python packages should also have short, all-lowercase names, although the use of underscores is discouraged. + +## Class names + +Class names should use PascalCase formatting, for instance: + +```python +class AdvectionDiffusion(Problem): + pass +``` + +The shorter, the better. Also, exception class names should end with the suffix `Error`, for instance + +```python +class ParameterError(Exception): + pass +``` + +## Function and variable names + +Function (or method) and variable names should use camelCase formatting, and the same goes for function arguments. For instance: + +```python +tLeft = 1 +quadType = 'LEGENDRE' + +def computeFejerRule(nNodes): + ... + +class NodeGenerator(): + def getOrthogPolyCoeffs(self, nCoeffs): + ... +``` + +:scroll: A few additional notes: + +1. In general, shorter names (possibly with abbreviations) should be favored, **as long as it does not deteriorate understandability**. For instance `getOrthogPolyCoeffs` rather than `getOrthogonalPolynomialCoefficients`. +2. The suffix `s` for plurals should be used even with abbreviations, for consistency (_e.g_ `nCoeffs`, `nNodes`, ...). +3. Acronyms can be used to simplify variable names, but **try not to start with them**. For instance, favor `jacobiMSSDC` or `multiStepSDCJacobi` rather than `MSSDCJacobi`. In general, acronyms should be put at the end of variable names. +4. An underscore can exceptionally be used at the end of variable names when it makes readability better and eases further developments. In that case, the characters after the underscore **should be all-uppercase with underscore** (minus is not allowed by Python syntax). For instance, when defining the same method with different specializations: + +```python +class MySweeper(Sweeper): + + def __init__(self, initSweep): + try: + self.initSweep = getattr(self, f'_initSweep_{initSweep}') + except AttributeError: + raise NotImplementedError(f'initSweep={initSweep}') + + def _initSweep_COPY(self): + pass + + def _initSweep_SPREAD(self): + pass + + # ... other implementations for initSweep +``` + +## Private and public attributes + +There is no such thing as private or public attributes in Python. But some attributes, if used only within the object's methods, can be indicated as private using the `_` prefix.
For instance: + +```python +class ChuckNorris(): + + def __init__(self, param): + self.param = param + + def _think(self): + print('...') + + def act(self): + if self.param == 'doubt': + self._think() + print('*?%&$?*§"$*$*§#{*') +``` + +:scroll: In general, variable names starting with a double underscore `__` are usually reserved for Python built-in names, _e.g_ `__dict__`, `__init__`, ... + +## Constants + +Constants are usually defined at module level and written in all-uppercase with underscores (all-uppercase with minus is not allowed by Python syntax). Examples: + +```python +NODE_TYPES = ['EQUID', 'LEGENDRE', 'CHEBY-1', 'CHEBY-2', 'CHEBY-3', 'CHEBY-4'] +QUAD_TYPES = ['GAUSS', 'RADAU-LEFT', 'RADAU-RIGHT', 'LOBATTO'] +``` + +For _constant string values_, however, favor the use of all-uppercase with minus, _e.g_ `RADAU-RIGHT`, `LEGENDRE-NUMPY`, to distinguish those from constant names. + +:bell: When constants are used, for instance, to select method specializations (with a suffix using all-uppercase with underscore), it is probably better to keep all-uppercase with minus for the constant string values and add a character replacement in between, for instance: + +```python +class MySweeper(Sweeper): + + def __init__(self, initSweep): + try: + self.initSweep = getattr(self, f'_initSweep_{initSweep.replace("-", "_")}') + except AttributeError: + raise NotImplementedError(f'initSweep={initSweep}') + + def _initSweep_COPY_PASTE(self): + pass + + def _initSweep_SPREAD_OUT(self): + pass + + # ... other implementations for initSweep +``` + +:arrow_left: [Back to Continuous Integration](./02_continuous_integration.md) --- +:arrow_up: [Contributing Summary](./../../CONTRIBUTING.md) --- +:arrow_right: [Next to Custom Implementations](./04_custom_implementations.md) \ No newline at end of file diff --git a/docs/contrib/04_custom_implementations.md b/docs/contrib/04_custom_implementations.md new file mode 100644 index 0000000000..bd9e997f04 --- /dev/null +++ b/docs/contrib/04_custom_implementations.md @@ -0,0 +1,7 @@ +# Custom implementation guidelines + +... under construction ...
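+
+Until this page is written, here is a minimal, purely illustrative sketch of the inheritance pattern mentioned in the [contributing summary](./../../CONTRIBUTING.md): custom functionality is added by subclassing a base class and overriding or extending its methods, following the [naming conventions](./03_naming_conventions.md). The class and method names below are hypothetical placeholders, **not** the actual `pySDC` API:
+
+```python
+# Illustrative only: "Problem" stands in for whichever pySDC base class you extend.
+class Problem:
+    def __init__(self, nVars):
+        self.nVars = nVars
+
+
+class MyCustomProblem(Problem):
+    """Custom class following the pySDC conventions (PascalCase class, camelCase methods)."""
+
+    def __init__(self, nVars, someParam=1.0):
+        super().__init__(nVars)
+        self.someParam = someParam  # new parameters come with pre-set default values
+
+    def evalF(self, u, t):
+        # problem-specific right-hand side evaluation (hypothetical method name)
+        return -self.someParam * u
+```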
+ +:arrow_left: [Back to Naming Conventions](./03_naming_conventions.md) --- +:arrow_up: [Contributing Summary](./../../CONTRIBUTING.md) --- +:arrow_right: [Next to a cute picture of cat](https://www.vecteezy.com/photo/2098203-silver-tabby-cat-sitting-on-green-background) \ No newline at end of file diff --git a/docs/convert_markdown.py b/docs/convert_markdown.py new file mode 100755 index 0000000000..b12e613f5b --- /dev/null +++ b/docs/convert_markdown.py @@ -0,0 +1,108 @@ +#!/usr/bin/env python3 +# -*- coding: utf-8 -*- +""" +Created on Tue Jan 17 19:47:56 2023 + +@author: telu +""" +import os +import glob +import json +import m2r2 +import shutil +import numpy as np + +mdFiles = [ + 'README.md', + 'CONTRIBUTING.md', + 'CHANGELOG.md', + 'CODE_OF_CONDUCT.md', + 'docs/contrib'] + +docSources = 'docs/source' + +# Move already images in the future build directory +os.makedirs('docs/build/html/_images/', exist_ok=True) +shutil.copytree('docs/img', 'docs/build/html/_images/docs/img', dirs_exist_ok=True) + +counter = np.array(0) + +with open('docs/emojis.json') as f: + emojis = set(json.load(f).keys()) + +def wrappEmojis(rst): + for emoji in emojis: + rst = rst.replace(emoji, f'|{emoji}|') + return rst + +def addSectionRefs(rst, baseName): + sections = {} + lines = rst.splitlines() + # Search for sections in rst file + for i in range(len(lines)-2): + conds = [ + len(lines[i+1]) and lines[i+1][0] in ['=', '-', '^', '"'], + lines[i+2] == lines[i-1] == '', + len(lines[i]) == len(lines[i+1])] + if all(conds): + sections[i] = lines[i] + # Add unique references before each section + for i, title in sections.items(): + ref = '-'.join([elt for elt in title.lower().split(' ') if elt != '']) + for char in ['#', "'", '^', '°', '!']: + ref = ref.replace(char, '') + ref = f'{baseName}/{ref}' + lines[i] = f'.. _{ref}:\n\n'+lines[i] + # Returns all concatenated lines + return '\n'.join(lines) + +def completeRefLinks(rst, baseName): + i = 0 + while i != -1: + i = rst.find(':ref:`', i) + if i != -1: + iLink = rst.find('<', i) + rst = rst[:iLink+1]+f'{baseName}/'+rst[iLink+1:] + i += 6 + return rst + +def addOrphanTag(rst): + return '\n:orphan:\n'+rst + +def setImgPath(rst): + i = 0 + while i != -1: + i = rst.find('