Cubic interpolation for triangular grids #1786

Merged
merged 3 commits into from Mar 7, 2013

5 participants

@GBillotey

This is the second and final PR for issue #1521.

It adds a CubicTriInterpolator to interpolate scalar fields defined on triangular grids, and a utility class UniformTriRefiner to perform high-resolution tricontouring by grid refinement.
A utility class TriAnalyzer is also added, which performs basic triangular mesh analysis.
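For readers skimming the PR, here is a minimal usage sketch of the two main new classes (class names and signatures as described above; the random field is illustrative only):

```python
import numpy as np
import matplotlib.tri as mtri

# Scalar field sampled on a random planar triangulation
rng = np.random.RandomState(0)
x, y = rng.rand(2, 50)
z = np.cos(2 * np.pi * x) * np.sin(2 * np.pi * y)
triang = mtri.Triangulation(x, y)

# C1-smooth cubic interpolation; kind='min_E' (default) or 'geom'
interp = mtri.CubicTriInterpolator(triang, z, kind='geom')
zi = interp(np.array([0.5]), np.array([0.5]))   # masked outside the hull

# High-resolution tricontouring via uniform refinement:
# each triangle is subdivided into 4**subdiv sub-triangles
refiner = mtri.UniformTriRefiner(triang)
tri_refi, z_refi = refiner.refine_field(z, subdiv=2)
```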

I have included tests for the 3 classes and 3 pylab_examples demonstrating the functionality, and have updated the sphinx docs, whats_new and CHANGELOG.

Regarding the documentation, I did my best to follow MEP10; any comment from MEP10 specialists is highly appreciated.
(I note that, as of numpydoc 0.4, a __call__ method is not exposed in the documentation unless explicitly listed. See also numpy/numpy@7a57dc1.)

This is my first matplotlib submission; my warmest thanks go to @ianthomas23 for his very detailed instructions throughout the process. Any remaining mistakes are mine.

@pelson
Matplotlib Developers member

This looks like an excellent piece of work. Inevitably, nobody will be able to review this line for line, but it looks well tested, well exampled and from a quick glance, well documented. @ianthomas23 - I know you've been very close to the implementation, but even so I'd love to hear from you regarding this work.

@GBillotey - how much of this work is new vs modifying previously existing functionality? I'd like to get a feel for the potential impact that this could have on current users.

@mdboom - where do we stand on this? Clearly @GBillotey & @ianthomas23 are domain experts in this implementation - personally, I'd be happy to merge this early given how much thought and effort has clearly been put in.

Good stuff guys!

@mdboom
Matplotlib Developers member

It would be great to add a few regression tests (it's probably fine if they are PNG only) perhaps based on these included examples.

@ianthomas23
Matplotlib Developers member

@pelson: I agree that it is an excellent piece of work. @GBillotey started off with some rough code to scratch his particular itch and has evolved it into something that is well organised and robust enough to be included in matplotlib. The examples are particularly good; one especially is more like a mini-tutorial than just an example plot. I have reviewed the code periodically and in my last review it was difficult to find any errors other than typos in the documentation. I would like to check it over one more time just in case, but after that I would be happy to accept it.

I'll answer your question to @GBillotey whilst I am writing. This is all new work, following on from my linear triangular grid interpolator PR of 2 months ago (#1582). There are some changes to LinearTriInterpolator to better structure the code, fix a bug and improve performance, but no change to the functionality or API. There should be no impact to existing users.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tests/test_triangulation.py
xs = xs.ravel()
ys = ys.ravel()
- zs = linear_interp(xs, ys)
- assert_array_equal(zs.mask, [True]*16)
+ linear_interp = mtri.LinearTriInterpolator(triang, z)
+ cubic_min_E = mtri.CubicTriInterpolator(triang, z)
+ cubic_geom = mtri.CubicTriInterpolator(triang, z, kind='geom')
+ zs = quad(xs, ys)
+ for interp in (cubic_min_E, cubic_geom):
+ diff_lin = np.abs(linear_interp(xs, ys) - zs)
@ianthomas23
Matplotlib Developers member

This line can go before the previous line.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tests/test_triangulation.py
+
+ # Instantiating a sparse Poisson matrix of size 48 x 48:
+ (n, m) = (12, 4)
+ mat = mtri.triinterpolate._Sparse_Matrix_coo(*poisson_sparse_matrix(n, m))
+ mat.compress_csc()
+ mat_dense = mat.to_dense()
+ # Testing a sparse solve for all 48 basis vector
+ for itest in range(n*m):
+ b = np.zeros(n*m, dtype=np.float64)
+ b[itest] = 1.
+ x, _ = mtri.triinterpolate._cg(A=mat, b=b, x0=np.zeros(n*m),
+ tol=1.e-10)
+ assert_array_almost_equal(np.dot(mat_dense, x), b)
+
+ # 2) Same matrix with inserting 2 rows - cols with null diag terms
+ # (but still linked with the rest of the matrice by extra-diag terms)
@ianthomas23
Matplotlib Developers member

'matrice' should be 'matrix'.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tri/triinterpolate.py
"""
- if tri == -1:
- return np.nan
+ # Here we try to deal with the simpliest colinear cases ; a zero
+ # energy and area is impsoed.
@ianthomas23
Matplotlib Developers member

'impsoed' should be 'imposed'.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tri/triinterpolate.py
+ col1 = _prod_vectorized(J1, np.expand_dims(tri_dz[:, 1, :], axis=3))
+ col2 = _prod_vectorized(J2, np.expand_dims(tri_dz[:, 2, :], axis=3))
+
+ dfdksi = _to_matrix_vectorized([
+ [col0[:, 0, 0], col1[:, 0, 0], col2[:, 0, 0]],
+ [col0[:, 1, 0], col1[:, 1, 0], col2[:, 1, 0]]])
+ dof[:, 0:7:3] = tri_z
+ dof[:, 1:8:3] = dfdksi[:, 0]
+ dof[:, 2:9:3] = dfdksi[:, 1]
+ return dof
+
+
+class _DOF_estimator_user(_DOF_estimator):
+ """ self.dz is imposed by user / Accounts for scaling if any """
+ def compute_dz(self, **kwargs):
+ (dzdx, dzdy) = kwargs['dz']
@ianthomas23
Matplotlib Developers member

Should there be an explicit test that kwarg dz is defined if kind=='user'? If so it could be here, but perhaps putting it in CubicTriInterpolator.init is better.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tri/triinterpolate.py
+ dM_inv = _safe_inv22_vectorized(dM)
+
+ dZ1 = tris_f[:, 1] - tris_f[:, 0]
+ dZ2 = tris_f[:, 2] - tris_f[:, 0]
+ dZ = np.vstack([dZ1, dZ2]).T
+ df = np.empty_like(dZ)
+
+ # With np.einsum : could be ej,eji -> ej
+ df[:, 0] = dZ[:, 0]*dM_inv[:, 0, 0] + dZ[:, 1]*dM_inv[:, 1, 0]
+ df[:, 1] = dZ[:, 0]*dM_inv[:, 0, 1] + dZ[:, 1]*dM_inv[:, 1, 1]
+ return df
+
+
+class _DOF_estimator_min_E(_DOF_estimator_geom):
+ """
+ The 'smothest' approximation, df is computed through global minimization
@ianthomas23
Matplotlib Developers member

'smothest' should be 'smoothest'.
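As an aside, the `np.einsum` equivalence hinted at by the `# With np.einsum` comment in the hunk above can be checked with a small standalone snippet (array shapes are illustrative; note the output subscript is `ei`, not `ej` as the comment suggests):

```python
import numpy as np

rng = np.random.RandomState(0)
dZ = rng.rand(5, 2)            # per-element right-hand sides
dM_inv = rng.rand(5, 2, 2)     # per-element inverted 2x2 matrices

# Loop-free form as written in the diff
df = np.empty_like(dZ)
df[:, 0] = dZ[:, 0]*dM_inv[:, 0, 0] + dZ[:, 1]*dM_inv[:, 1, 0]
df[:, 1] = dZ[:, 0]*dM_inv[:, 0, 1] + dZ[:, 1]*dM_inv[:, 1, 1]

# Equivalent einsum: sum over j of dZ[e, j] * dM_inv[e, j, i]
df_einsum = np.einsum('ej,eji->ei', dZ, dM_inv)
```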

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tri/triinterpolate.py
+ if err0 < err:
+ # Maybe a good occasion to raise a warning here ?
+ warnings.warn('In TriCubicInterpolator initialization, PCG sparse'
+ + ' solver did not converge after 1000 iterations. '
+ + '`geom` approximation is used instead of `min_E`')
+ Uf = Uf0
+
+ # Building dz from Uf
+ dz = np.empty([self._pts.shape[0], 2], dtype=np.float64)
+ dz[:, 0] = Uf[::2]
+ dz[:, 1] = Uf[1::2]
+ return dz
+
+
+# The following private :class:_Sparse_Matrix_coo and :func:_cg provide
+# a PCG sparse solver for (symetric) elliptic problems.
@ianthomas23
Matplotlib Developers member

'symetric' should be 'symmetric'.

@ianthomas23 ianthomas23 commented on an outdated diff Feb 26, 2013
lib/matplotlib/tri/triinterpolate.py
+# tangent plane is also problematic.
+#
+# Implementation:
+# Most of the time, when computing the inverse of a rank-deficient matrix it
+# is safe to simply return the null matrix (which is the implementation in
+# :func:`_safe_inv22_vectorized`). This is because of point 2), itself
+# enforced by:
+# - null area hence null energy in :class:`_DOF_estimator_min_E`
+# - angles close or equal to 0 or np.pi hence null weight in
+# :class:`_DOF_estimator_geom`.
+# Note that the function angle -> weight is continuous and maximum for an
+# angle np.pi/2 (refer to :meth:`compute_geom_weights`)
+# The exception is the computation of barycentric coordinates, which is done
+# by inversion of the *metric* matrix. In this case, we need to compute a set
+# of valid coordinates (1 among numerous possibilities), to ensure point 4).
+# We benefit here from the symetry of metric = J x J.T, which makes it easier
@ianthomas23
Matplotlib Developers member

'symetry' should be 'symmetry'.

@GBillotey

I have added 2 PNG regression tests and taken into account @ianthomas23 comments. I will wait a few days before actually pushing the changes, just in case somebody else wants to comment.

[Image: tri_smooth_contouring-expected]

[Image: tri_smooth_gradient-expected]

@mdboom
Matplotlib Developers member

Very good. One minor point: for the tests, can you add remove_text=True to the arguments passed to the image_comparison decorator. This will remove the text from the plot (which is unimportant for what's being tested here anyway) and make the test less likely to fail when run on a machine with a different version of freetype.

@NelleV

Could you make sure your code is pep8 compliant and follows the numpydoc conventions? I'll point out a couple of pep8 violations in the code.

@NelleV NelleV commented on the diff Mar 1, 2013
lib/matplotlib/tri/trirefine.py
+ interpolated values of the field at the refined triangulation nodes.
+
+ """
+ def __init__(self, triangulation):
+ if not isinstance(triangulation, Triangulation):
+ raise ValueError("Expected a Triangulation object")
+ self._triangulation = triangulation
+
+
+class UniformTriRefiner(TriRefiner):
+ """
+ Uniform mesh refinement by recursive subdivisions.
+
+ Parameters
+ ----------
+ triangulation : :class:`~matplotlib.tri.Triangulation`
@NelleV
NelleV added a note Mar 1, 2013

You can write ~matplotlib.tri.Triangulation directly: sphinx will link automatically to the class.

@pelson
Matplotlib Developers member
pelson added a note Mar 1, 2013

Really? Do you happen to know the rules of this namespacing? I couldn't find it in the docs of http://sphinx-doc.org/domains.html#the-python-domain or http://sphinx-doc.org/domains.html#info-field-lists...

@NelleV
NelleV added a note Mar 1, 2013

I think it's some magic from numpydoc but I'm not sure.

On sklearn, we even have some magic happening in the examples: it links all methods from the source code to the documentation, including matplotlib and numpy methods: http://scikit-learn.org/dev/auto_examples/plot_classification_probability.html
I don't know how we do that, but I find it very cool :)

Isn't it the so-called "default role" of Sphinx > 0.4? http://sphinx-doc.org/config.html#confval-default_role

@NelleV: is it a "can" or a "should"?
I understand from MEP10 that all cross-references will at some point have to be updated "in a semi-automated way" to this format (provided this default role doesn't become ambiguous, though).
Note that I would be happy to do it if that is the recommendation, I am just asking.

@NelleV
NelleV added a note Mar 4, 2013

@GBillotey it is a "can" in terms of sphinx/numpydoc, and a "should" in terms of MEP10, but I don't think it should be a roadblock for merging this patch.
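For reference, enabling this behaviour amounts to a one-line Sphinx configuration (hypothetical `conf.py` excerpt; the exact role a project chooses may differ):

```python
# Hypothetical Sphinx conf.py excerpt: with a default role configured,
# bare `~matplotlib.tri.Triangulation` in a docstring is resolved as a
# cross-reference without an explicit :class: prefix.
default_role = 'obj'
```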

@NelleV NelleV and 1 other commented on an outdated diff Mar 1, 2013
lib/matplotlib/tri/triinterpolate.py
+ Returns the (dense) vector of the diagonal elements.
+ """
+ in_diag = (self.rows == self.cols)
+ diag = np.zeros(min(self.n, self.n), dtype=np.float64) # default 0.
+ diag[self.rows[in_diag]] = self.vals[in_diag]
+ return diag
+
+
+def _cg(A, b, x0=None, tol=1.e-10, maxiter=1000):
+ """
+ Use Preconditioned Conjugate Gradient iteration to solve A x = b
+ A simple Jacobi (diagonal) preconditionner is used.
+
+ Parameters
+ ----------
+ *A*: _Sparse_Matrix_coo
@NelleV
NelleV added a note Mar 1, 2013

This does not follow the numpydoc conventions. I'm not sure it is rendered though, as it is a private method.

I confirm it is a private function, not exposed in the documentation.

My (maybe too lenient) understanding of the documentation rules was that we were pretty much free regarding the format of such docstrings.
Once again, please advise, I am just asking.

@NelleV
NelleV added a note Mar 4, 2013

I'd rather have everything consistent, but I'm not sure there are any rules concerning those. I'm glad there is documentation here, as most of matplotlib's private methods are not documented at all. I think this is, as is, already better than most of the codebase.
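For context, the Jacobi-preconditioned conjugate gradient iteration that the private `_cg` docstring describes can be sketched with dense NumPy arrays as follows (an illustrative reimplementation, not matplotlib's actual sparse code):

```python
import numpy as np

def jacobi_pcg(A, b, x0=None, tol=1e-10, maxiter=1000):
    """Preconditioned CG with a Jacobi (diagonal) preconditioner.

    Dense-matrix sketch of the approach; A must be symmetric
    positive definite.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner
    r = b - A @ x                     # initial residual
    z = M_inv * r                     # preconditioned residual
    p = z.copy()                      # initial search direction
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.sqrt(r @ r) < tol:      # converged on residual norm
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # conjugate direction update
        rz = rz_new
    return x

# Example: 1-D Poisson matrix (tridiagonal, SPD)
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = jacobi_pcg(A, b)
```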

@ianthomas23

Not needed as remove_text=True.

@ianthomas23

Not needed as remove_text=True

@ianthomas23
Matplotlib Developers member

Shouldn't this be 'dz=None' for completeness?

@ianthomas23
Matplotlib Developers member

I am happy with this PR and am ready to merge it, but will wait a couple of days in case there are any final objections. All the comments have been addressed, in particular image tests have been added and the code is sufficiently MEP10 compatible. I have run pep8 on all the source files, and the only issues raised are not enough whitespace around operators and too much whitespace around square brackets when constructing lists/arrays. The former corresponds to situations where pep8 is overzealous in interpreting the standard, and the latter has been done on purpose so that columns of numbers are vertically aligned, which is much more readable than strictly following PEP8.

The Travis failure appears to be a dud.

@pelson
Matplotlib Developers member

Though it's not strictly related to this PR, I've been meaning to ask if you have any idea how much of this work could be applied to spherical triangulation? Is that a completely different beast, or do you think there could be some significant overlap?

Great work!

@GBillotey

@pelson Thanks!

This "plane" interpolator should provide a good starting point for a spherical triangulation interpolator. My feeling is that the amount of additional code will depend on the typical mesh density of the data you want to interpolate.

  • The implementation relies on a "thin plate model" and would still be suitable for a reasonably dense mesh of the sphere. In this case I expect you could use most of the code as-is or with minor adaptations (especially _ReducedHCT_Element, as it is based on barycentric coordinates and independent of the local basis).
  • If you want to improve the modeling for really coarse meshes, then you might consider switching to a "thin shell model" to better account for the curvature of the sphere. In this case there would be much more mathematics involved.

One additional point to consider: is your final goal plotting contours, or interpolating?
In the latter case you might have to write your own TriFinder class (maybe a wavefront search?), as I believe the beautiful algorithm used by @ianthomas23 in #1582 will not be applicable.
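For reference, the planar TriFinder is used like this (a minimal sketch; the default finder is obtained via `Triangulation.get_trifinder`):

```python
import numpy as np
import matplotlib.tri as mtri

# Two triangles covering the unit square
x = np.array([0., 1., 0., 1.])
y = np.array([0., 0., 1., 1.])
triang = mtri.Triangulation(x, y)

# Default finder: maps (x, y) query points to containing triangle
# indices (-1 for points outside the triangulation)
trifinder = triang.get_trifinder()
tri_indices = trifinder(np.array([0.25]), np.array([0.25]))
```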

If you decide to give it a try, do not hesitate to email me.

@ianthomas23 ianthomas23 merged commit 709eb31 into matplotlib:master Mar 7, 2013

1 check passed

Details: The Travis build passed
@GBillotey GBillotey deleted the GBillotey:smooth_tri branch Mar 8, 2013