Merged
87 commits
3ae7543
Started introducing defaults to a advection
Dec 16, 2022
db9c81f
Merge remote-tracking branch 'upstream/master' into problem_defaults
Jan 9, 2023
090af96
Removed `**kwargs` from the core problem class
Jan 9, 2023
edccb25
Put back the `**kwargs` :P
Jan 9, 2023
34fb360
Merge pull request #238 from Parallel-in-Time/master
brownbaerchen Jan 9, 2023
401694f
Merge pull request #239 from brownbaerchen/problem_defaults
brownbaerchen Jan 9, 2023
706533a
TL: trying solution to replace frozenclass
tlunet Jan 12, 2023
5b7f535
TL: reintroduced Gauss initial solution in advectionNd (1D)
tlunet Jan 13, 2023
7bbf286
TL: cleaned implementation of advectionNd
tlunet Jan 13, 2023
510d969
TL: appreciating flakeheaven help on this one ;)
tlunet Jan 13, 2023
7a24e72
TL: readOnly to dictionnary
tlunet Jan 13, 2023
c86e478
TL: better _readOnly instantiation
tlunet Jan 13, 2023
1c41e0a
TL: tried to simplifiy read-only parameters
tlunet Jan 13, 2023
d6dfe6a
TL: moved RegisterParams and ReadOnlyError classes in proper modules
tlunet Jan 15, 2023
e829f9d
TL: minor correction
tlunet Jan 15, 2023
3557e22
TL: other minor bug
tlunet Jan 15, 2023
8be4559
TL: added datatype initializer for Problem class
tlunet Jan 15, 2023
029c64b
TL: that's a new one ...
tlunet Jan 15, 2023
08667eb
TL: adding metaclass for RegisterParams class
tlunet Jan 16, 2023
d77ea23
TL: putting ndim to property for advectionNd class
tlunet Jan 16, 2023
6c989cb
TL: adding documentation
tlunet Jan 16, 2023
6b70df8
Merge remote-tracking branch 'upstream/master' into v6
tlunet Jan 16, 2023
2179e15
Merge pull request #252 from Parallel-in-Time/master
tlunet Jan 16, 2023
3c10b63
TL: merge attribute storing and parameter registering
tlunet Jan 17, 2023
a0351fc
TL: documentation update
tlunet Jan 17, 2023
586d2cc
Merge remote-tracking branch 'upstream/v6' into v6
tlunet Jan 17, 2023
d8540a7
Merge pull request #259 from Parallel-in-Time/master
tlunet Jan 19, 2023
21fac8f
TL: corrected docstring
tlunet Jan 19, 2023
60c9ec6
Merge remote-tracking branch 'upstream/v6' into v6
tlunet Jan 19, 2023
ed38531
Merge pull request #244 from tlunet/v6
brownbaerchen Jan 23, 2023
25e45c5
Merge branch 'master' into v6
tlunet Jan 23, 2023
db20e87
Merge remote-tracking branch 'upstream/master' into v6
tlunet Jan 24, 2023
847e8ca
TL: first draft for new simplified datatype classes
tlunet Jan 24, 2023
3f7c6b7
TL: reorganized datatype playground, started performance tests
tlunet Jan 24, 2023
f6f6988
TL: labels to performance plot
tlunet Jan 25, 2023
b275d86
TL: switched to proper timing functions
tlunet Jan 26, 2023
19e596c
Merge remote-tracking branch 'upstream/master' into v6
tlunet Feb 11, 2023
abf4da2
TL: switch HeatEquation_ND_FD to default parameter format
tlunet Feb 11, 2023
d151d80
TL: switch TestEquation_0D to new parameter format
tlunet Feb 11, 2023
7afac92
TL: 51/100 pytest work
tlunet Feb 12, 2023
a8db607
TL: blackening
tlunet Feb 12, 2023
605bccb
TL: test AC are passing
tlunet Feb 12, 2023
22da733
TL: test DAE are passing
tlunet Feb 12, 2023
ddbe309
TL: test asympconv are passing
tlunet Feb 12, 2023
5de18d4
TL: moved dtype as problem class attribute, more project tests passing
tlunet Feb 12, 2023
e5df699
TL: more project test are passing + black
tlunet Feb 12, 2023
fa54391
TL: almost all project tests passing
tlunet Feb 12, 2023
c8ab429
TL: blackening
tlunet Feb 12, 2023
56c61fd
TL: going to tutorials
tlunet Feb 12, 2023
7454cfb
cupy tests should work now
Mar 1, 2023
ec0539d
Merge remote-tracking branch 'upstream/v6' into v6
tlunet Mar 1, 2023
320730c
Moved solver from advection and diffusion to generic parent class
Mar 1, 2023
129d8f3
Merge pull request #280 from brownbaerchen/v6
brownbaerchen Mar 1, 2023
fb7329a
Merge remote-tracking branch 'upstream/v6' into v6
tlunet Mar 1, 2023
58ae1f9
TL: finished debug tutorials
tlunet Mar 1, 2023
625f206
Merge remote-tracking branch 'upstream/master' into v6
tlunet Mar 1, 2023
7accce1
Added work counters to generic FD problem class
Mar 1, 2023
11776fc
TL: forgot those tests
tlunet Mar 1, 2023
4d3bd17
TL: forgot black (of course)
tlunet Mar 1, 2023
02e186f
TL: some additional correction with solver_type
tlunet Mar 1, 2023
d1ece1e
Changed solver names to upper case
Mar 2, 2023
9974d97
Changed solver interface for GPU heat equation
Mar 2, 2023
2c5d50f
TL: reformating and testing for Logistic Equation problem
tlunet Mar 3, 2023
2f946ce
TL: black ... again
tlunet Mar 3, 2023
ba3e4d1
TL: hack to solve dolfin/mpi4py MPI clash
tlunet Mar 3, 2023
f3aa24c
TL: oh holy black ...
tlunet Mar 3, 2023
f8f3f15
TL: finished preleminary doc for v6 + some pragma
tlunet Mar 20, 2023
1a73bd5
Merge remote-tracking branch 'upstream/master' into v6
tlunet Mar 20, 2023
69ad4c3
TL: forgot that ...
tlunet Mar 20, 2023
ff5ffdb
TL: fixed LeakySuperconductor tests
tlunet Mar 20, 2023
8a216b6
TL: typo and coverage pragma
tlunet Mar 23, 2023
8fe945e
TL: fixing dictionnary typo
tlunet Mar 23, 2023
bd134ba
TL: updated problem classes not covered by tests
tlunet Mar 23, 2023
9f98a8d
TL: of course !
tlunet Mar 23, 2023
d47b866
TL: moving pragma cover
tlunet Mar 23, 2023
2ebcf05
TL: default parameters for FastWaveSlowWave
tlunet Mar 23, 2023
23e9385
TL: added docstring page in contribution guidelines
tlunet Mar 23, 2023
03cd1b7
TL: replaced dolfin/mpi4py hack by a TODO
tlunet Mar 23, 2023
f0acf24
TL: explicitly check mpi4py imports in tests
tlunet Mar 23, 2023
5ab7a12
TL: more elegant solution to solve the dolfin/mpi4py clash
tlunet Mar 23, 2023
ece2b63
TL: trying to fix pragma cover
tlunet Mar 23, 2023
fd62146
TL: forgot that ...
tlunet Mar 23, 2023
cec04ad
TL: another try
tlunet Mar 23, 2023
ea61807
TL: maybe a more elegant way
tlunet Mar 23, 2023
829484e
TL: fixing previous mistake
tlunet Mar 23, 2023
efc0fad
TL: black, I hate you
tlunet Mar 23, 2023
348606c
TL: codecov is also asking for it ...
tlunet Mar 23, 2023
3 changes: 2 additions & 1 deletion CONTRIBUTING.md
@@ -7,12 +7,13 @@ Additionally, a _few_ rules are set to enforce code readability, consistency and
Others are specific conventions chosen for the pySDC library, that may follow Python standards (or not ...), detailed in the [naming conventions](./docs/contrib/03_naming_conventions.md) page.

Finally, while `pySDC` provides many base functionalities that implement classical flavors of SDC, it also allows problem-specific applications through Object-Oriented Programming (OOP) and the implementation of custom inherited classes.
This follows a specific OOP framework, you can look at the page on [custom implementations](./docs/contrib/04_custom_implementations.md) for more details.
This follows a specific OOP framework; you can look at the page on [custom implementations](./docs/contrib/04_custom_implementations.md) for more details. Additional guidelines are also given on how to [document the code](./docs/contrib/05_documenting_code.md) in `pySDC`.

1. [GitHub Forks and Pull Requests](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/getting-started/about-collaborative-development-models)
2. [Pull Requests Recommendations](./docs/contrib/01_pull_requests.md)
3. [Continuous Integration](./docs/contrib/02_continuous_integration.md)
4. [Naming Conventions](./docs/contrib/03_naming_conventions.md)
5. [Custom Implementations](./docs/contrib/04_custom_implementations.md)
6. [Documenting Code](./docs/contrib/05_documenting_code.md)

:arrow_left: [Back to main page](./README.md)
14 changes: 6 additions & 8 deletions docs/contrib/02_continuous_integration.md
@@ -56,7 +56,7 @@ This stage allows to checks how much of the `pySDC` code is tested by the previo
- `pySDC/projects`
- `pySDC/tutorial`

This analysis is done in parallel to the tests each time a pull request is made on any branch (main repository or fork).
You can look at the current coverage report for the master branch [here](https://parallel-in-time.org/pySDC/coverage/index.html) or compare the results with previous builds [here](https://app.codecov.io/gh/Parallel-in-Time/pySDC). Codecov will also comment on any pull request, indicating the change of coverage.

During development, you can also run the coverage tests locally, using :
@@ -82,20 +82,18 @@ This will generate the coverage report in a `htmlcov` folder, and you can open t
### Coverage exceptions

Some types of code lines will be ignored by the coverage analysis (_e.g._ lines starting with `raise`, ...); see the `[tool.coverage.report]` section in `pyproject.toml`.
Parts of the code (functions, conditionals, for loops, etc ...) can be ignored by coverage analysis using the `# pragma: no cover` comment, for instance

```python
# ...
# code analyzed by coverage
# ...
# pragma: no cover
if condition:
if condition: # pragma: no cover
    # code ignored by coverage
    # ...
# code analyzed by coverage
# ...
# pragma: no cover
def function():
def function(): # pragma: no cover
    # all function code is ignored by coverage
```

@@ -109,7 +107,7 @@ If you think the pragma should be used in other parts of your pull request, plea
## Documentation generation

Documentation is built using [sphinx](https://www.sphinx-doc.org/en/master/).
To check its generation, you can wait for all the CI tasks to finish, download the `docs` artifact, unzip it and open the `index.html` file there with your favorite browser.

However, when you are working on documentation (of the project, of the code, etc ...), you can already build and check the website locally :

@@ -124,7 +122,7 @@ sphinx-build -b html docs/source docs/build/html

Then you can open `docs/build/html/index.html` using your favorite browser and check how your own documentation looks on the website.

> :bell: **Important** : running all the tests is necessary to generate graphs and images used by the website.
> But you can still generate the website without running them: only the images for the tutorials, projects and playgrounds will be missing.
> This approach can be considered for local testing of your contribution when it does not concern parts containing images (_i.e._ project or code documentation).

59 changes: 57 additions & 2 deletions docs/contrib/04_custom_implementations.md
@@ -1,7 +1,62 @@
# Custom implementation guidelines

... in construction ...
`pySDC` solves (non-)linear ODEs of the form

$$
\frac{du}{dt} = f(u, t), \quad u(0)=u_0, \quad t \in [0, T],
$$

where $u$ is a vector or scalar representing one or more variables.

The type of variable, the definition of the right-hand side $f(u,t)$ and the initial solution value $u_0$ are defined in a given `Problem` class.

... to be continued ...

## Implementing a custom problem class

Any problem class inherits from the same base class, which is (currently) the `ptype` class from the module `pySDC.core.Problem`.
Each custom problem should inherit from this base class, as in the following example template :

- right-hand side $f(u,t)=\lambda u + ct$ with
  - $\lambda$ one or more complex values (in vector form)
  - $c$ one scalar value

```python
import numpy as np

from pySDC.core.Problem import ptype
from pySDC.core.Errors import ProblemError
from pySDC.implementations.datatype_classes.mesh import mesh

class MyCustomProblem(ptype):
    # 1) Provide datatype class as class attribute
    dtype_u = mesh  # -> used for u values
    dtype_f = mesh  # -> used for f(u,t) values

    # 2) Define constructor
    def __init__(self, lam, c):

        # Store lambda values into a numpy array (with copy) + check
        lam = np.array(lam)
        if len(lam.shape) > 1:
            raise ProblemError(f"lambda values must be given as 1D vector, got shape {lam.shape}")

        # Call base constructor
        super().__init__(init=(lam.size, None, lam.dtype))

        # Register parameters
        self._makeAttributeAndRegister('lam', 'c', localVars=locals(), readOnly=True)

    # 3) Define RHS function
    def eval_f(self, u, t):
        f = self.f_init  # Generate new datatype to store result
        f[:] = self.lam * u + self.c * t  # Compute RHS value
        return f
```

:bell: The `_makeAttributeAndRegister` method automatically adds `lam` and `c` as attributes, and registers them in the list of parameters that are printed in the outputs of `pySDC`.
If you set `readOnly=True`, those parameters cannot be changed after the problem is initialized (if not specified, `readOnly=False` is used).
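
As a quick, illustrative sketch (not part of the diff itself) of how these registered parameters behave, relying on the `RegisterParams` helper introduced in `pySDC/core/Common.py` further down this page:

```python
# Hypothetical usage of the MyCustomProblem class defined above
prob = MyCustomProblem(lam=[1.0, 2.0], c=0.5)

print(prob.params)  # {'lam': array([1., 2.]), 'c': 0.5} (order may vary)

prob.c = 1.0  # raises ReadOnlyError, since 'c' was registered with readOnly=True
```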

:arrow_left: [Back to Naming Conventions](./03_naming_conventions.md) ---
:arrow_up: [Contributing Summary](./../../CONTRIBUTING.md) ---
:arrow_right: [Next to a cute picture of cat](https://www.vecteezy.com/photo/2098203-silver-tabby-cat-sitting-on-green-background)
:arrow_right: [Next to Documenting Code](./05_documenting_code.md)
93 changes: 93 additions & 0 deletions docs/contrib/05_documenting_code.md
@@ -0,0 +1,93 @@
# Documenting Code

When developing a new class or function, or improving current classes in `pySDC`, adding Python docstrings to document the code is important, in particular :

- to help developers understand how classes and functions work when reading the code
- to help users understand how classes and functions work when reading the [documentation](https://parallel-in-time.org/pySDC/#api-documentation)

`pySDC` follows the [NumPy Style Python Docstring](https://numpydoc.readthedocs.io/en/latest/format.html); below is a simplified example
for a class and a function :

> :bell: Don't document the `__init__` method, but rather the class itself. Also describe parameters (given to `__init__`) and attributes (stored in the class) separately.

```python
class LagrangeApproximation(object):
    r"""
    Class approximating any function on a given set of points using barycentric
    Lagrange interpolation.

    Let note :math:`(t_j)_{0\leq j<n}` the set of points, then any scalar
    function :math:`f` can be approximated by the barycentric formula :

    .. math::
        p(x) =
        \frac{\displaystyle \sum_{j=0}^{n-1}\frac{w_j}{x-x_j}f_j}
        {\displaystyle \sum_{j=0}^{n-1}\frac{w_j}{x-x_j}},

    where :math:`f_j=f(t_j)` and

    .. math::
        w_j = \frac{1}{\prod_{k\neq j}(x_j-x_k)}

    are the barycentric weights.
    The theory and implementation is inspired from `this paper <http://dx.doi.org/10.1137/S0036144502417715>`_.

    Parameters
    ----------
    points : list, tuple or np.1darray
        The given interpolation points, no specific scaling, but must be
        ordered in increasing order.

    Attributes
    ----------
    points : np.1darray
        The interpolating points
    weights : np.1darray
        The associated barycentric weights
    """

    def __init__(self, points):
        pass  # Implementation ...

    @property
    def n(self):
        """Number of points"""
        pass  # Implementation ...

    def getInterpolationMatrix(self, times):
        r"""
        Compute the interpolation matrix for a given set of discrete "time"
        points.

        For instance, if we note :math:`\vec{f}` the vector containing the
        :math:`f_j=f(t_j)` values, and :math:`(\tau_m)_{0\leq m<M}`
        the "time" points where to interpolate.
        Then :math:`I[\vec{f}]`, the vector containing the interpolated
        :math:`f(\tau_m)` values, can be obtained using :

        .. math::
            I[\vec{f}] = P_{Inter} \vec{f},

        where :math:`P_{Inter}` is the interpolation matrix returned by this
        method.

        Parameters
        ----------
        times : list-like or np.1darray
            The discrete "time" points where to interpolate the function.

        Returns
        -------
        PInter : np.2darray(M, n)
            The interpolation matrix, with :math:`M` rows (size of the **times**
            parameter) and :math:`n` columns.
        """
        pass  # Implementation ...

```

A more detailed example is given [here ...](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_numpy.html)

:arrow_left: [Back to custom implementations](./04_custom_implementations.md) ---
:arrow_up: [Contributing Summary](./../../CONTRIBUTING.md) ---
:arrow_right: [Next to a cute picture of cat](https://www.vecteezy.com/photo/2098203-silver-tabby-cat-sitting-on-green-background)
2 changes: 0 additions & 2 deletions pySDC/core/Collocation.py
@@ -1,7 +1,5 @@
import logging

import numpy as np
import scipy.interpolate as intpl

from pySDC.core.Nodes import NodesGenerator
from pySDC.core.Errors import CollocationError
75 changes: 75 additions & 0 deletions pySDC/core/Common.py
@@ -0,0 +1,75 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Description
-----------

Module containing utility class(es) from which some of the pySDC base
classes inherit.
"""
from pySDC.core.Errors import ReadOnlyError


class _MetaRegisterParams(type):
    """Metaclass for RegisterParams base class"""

    def __new__(cls, name, bases, dct):
        obj = super().__new__(cls, name, bases, dct)
        obj._parNamesReadOnly = set()
        obj._parNames = set()
        return obj


class RegisterParams(metaclass=_MetaRegisterParams):
    """
    Base class to register parameters.

    Attributes
    ----------
    params : dict (property)
        Dictionary containing names and values of registered parameters.
    _parNames : set of str
        Names of all the registered parameters.
    _parNamesReadOnly : set of str
        Names of all the parameters registered as read-only.
    """

    def _makeAttributeAndRegister(self, *names, localVars=None, readOnly=False):
        """
        Register a list of attribute names as parameters of the class.

        Parameters
        ----------
        *names : list of str
            The names of the parameters to be registered (should be class attributes).
        localVars : dict
            Dictionary containing key=names and values=paramValues for each
            parameter given in names. Can be provided, for instance, using the
            `locals()` built-in dictionary. MUST BE provided as soon as
            names contains anything.
        readOnly : bool, optional
            Whether or not to store the parameters as read-only attributes.
        """
        if len(names) > 1 and localVars is None:
            raise ValueError("a dictionary must be provided in localVars with parameters values")
        # Set parameters as attributes
        for name in names:
            try:
                super().__setattr__(name, localVars[name])
            except KeyError:  # pragma: no cover
                raise ValueError(f'value for {name} not given in localVars')
        # Register as class parameter
        if readOnly:
            self._parNamesReadOnly = self._parNamesReadOnly.union(names)
        else:
            self._parNames = self._parNames.union(names)

    @property
    def params(self):
        """Dictionary containing names and values of registered parameters"""
        return {name: getattr(self, name) for name in self._parNamesReadOnly.union(self._parNames)}

    def __setattr__(self, name, value):
        if name in self._parNamesReadOnly:
            raise ReadOnlyError(name)
        super().__setattr__(name, value)
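
For reference, a minimal usage sketch of `RegisterParams` on its own (the `Toy` class below is hypothetical and not part of this PR):

```python
from pySDC.core.Common import RegisterParams


class Toy(RegisterParams):  # hypothetical example class
    def __init__(self, speed, nvars=16):
        # 'speed' is frozen after construction, 'nvars' stays writable
        self._makeAttributeAndRegister('speed', localVars=locals(), readOnly=True)
        self._makeAttributeAndRegister('nvars', localVars=locals())


toy = Toy(speed=1.0)
print(toy.params)  # {'speed': 1.0, 'nvars': 16} (order may vary)
toy.nvars = 32     # allowed: 'nvars' was registered as writable
toy.speed = 2.0    # raises ReadOnlyError (see pySDC/core/Errors.py below)
```
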
11 changes: 5 additions & 6 deletions pySDC/core/Controller.py
@@ -185,12 +185,11 @@ def dump_setup(self, step, controller_params, description):
else:
out += ' %s = %s\n' % (k, v)
out += '--> Problem: %s\n' % L.prob.__class__
for k, v in vars(L.prob.params).items():
if not k.startswith('_'):
if k in description['problem_params']:
out += '--> %s = %s\n' % (k, v)
else:
out += ' %s = %s\n' % (k, v)
for k, v in L.prob.params.items():
if k in description['problem_params']:
out += '--> %s = %s\n' % (k, v)
else:
out += ' %s = %s\n' % (k, v)
out += '--> Data type u: %s\n' % L.prob.dtype_u
out += '--> Data type f: %s\n' % L.prob.dtype_f
out += '--> Sweeper: %s\n' % L.sweep.__class__
9 changes: 9 additions & 0 deletions pySDC/core/Errors.py
@@ -68,3 +68,12 @@ class ProblemError(Exception):
"""

pass


class ReadOnlyError(Exception): # pragma: no cover
"""
Exception thrown when setting a read-only class attribute
"""

def __init__(self, name):
super().__init__(f'cannot set read-only attribute {name}')
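
For illustration only (not part of the diff), the resulting error message reads like this:

```python
from pySDC.core.Errors import ReadOnlyError

try:
    raise ReadOnlyError('lam')
except ReadOnlyError as err:
    print(err)  # cannot set read-only attribute lam
```
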
14 changes: 6 additions & 8 deletions pySDC/core/Lagrange.py
@@ -74,6 +74,12 @@ class LagrangeApproximation(object):
    are the barycentric weights.
    The theory and implementation is inspired from `this paper <http://dx.doi.org/10.1137/S0036144502417715>`_.

    Parameters
    ----------
    points : list, tuple or np.1darray
        The given interpolation points, no specific scaling, but must be
        ordered in increasing order.

    Attributes
    ----------
    points : np.1darray
@@ -83,14 +89,6 @@ class LagrangeApproximation(object):
"""

def __init__(self, points):
"""

Parameters
----------
points : list, tuple or np.1darray
The given interpolation points, no specific scaling, but must be
ordered in increasing order.
"""
points = np.asarray(points).ravel()

diffs = points[:, None] - points[None, :]
2 changes: 1 addition & 1 deletion pySDC/core/Level.py
@@ -70,7 +70,7 @@ def __init__(self, problem_class, problem_params, sweeper_class, sweeper_params,

        # instantiate sweeper, problem and hooks
        self.__sweep = sweeper_class(sweeper_params)
        self.__prob = problem_class(problem_params)
        self.__prob = problem_class(**problem_params)

        # set level parameters and status
        self.params = _Pars(level_params)
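
With this change, problem parameters are plain keyword arguments of the problem constructor instead of a parameter dictionary wrapped in a frozen class. A rough sketch of the effect, assuming the `MyCustomProblem` example from the contribution guide above and the usual `description` dictionary passed to the controller:

```python
description = {
    'problem_class': MyCustomProblem,
    'problem_params': {'lam': [1.0, 2.0], 'c': 0.5},
    # ... sweeper_class, sweeper_params, level_params, etc.
}

# What the level now does internally:
prob = description['problem_class'](**description['problem_params'])
# i.e. MyCustomProblem(lam=[1.0, 2.0], c=0.5), rather than passing the raw dict
```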