Added parser for inputting sparse jacobian #1257

Draft: wants to merge 56 commits into base: master

Commits (56)
068d6f2
Initial work
Letizia97 Feb 28, 2024
4f06d4f
Fix error with self.method .
Letizia97 Feb 28, 2024
e78f85a
Add sparse option to analytic jacobian.
Letizia97 Mar 1, 2024
8610768
Add docs.
Letizia97 Mar 4, 2024
e9eb026
Add "sparse" to valid options for analytic jacobian.
Letizia97 Mar 4, 2024
69d908e
Change matmul to dot.
Letizia97 Mar 4, 2024
f6c8d3d
Improve docstrings and fix linting.
Letizia97 Mar 4, 2024
4d86dfd
Fix docs.
Letizia97 Mar 5, 2024
89f0c9f
Fix error due to self.method.
Letizia97 Mar 5, 2024
31041fc
Improve prob def file for sparse jac and update docs.
Letizia97 Mar 7, 2024
c367a7d
Remove comment.
Letizia97 Mar 10, 2024
8ba35fb
Add tests written so far.
Letizia97 Mar 10, 2024
6324c36
Update key of dict to reflect latest changes
Letizia97 Mar 11, 2024
7b49d45
Introduce dense jac in fitbenchmark_parser.
Letizia97 Mar 11, 2024
f1a41bf
Add parser tests written so far.
Letizia97 Mar 11, 2024
1ef1631
Remove blank lines.
Letizia97 Mar 12, 2024
26b794d
Merge branch 'master' into 1208-Sparse_jacobian
Letizia97 Mar 13, 2024
a7b9857
Fix docs.
Letizia97 Mar 13, 2024
21ba636
Fix tests.
Letizia97 Mar 13, 2024
57e56e7
Improve typing specification.
Letizia97 Mar 13, 2024
2734d82
Remove reminder.
Letizia97 Mar 13, 2024
cd3576d
Revert unnecessary changes
Letizia97 Mar 13, 2024
24e67d8
Fix funcs for turn jac/sparse_jac into callable.
Letizia97 Mar 14, 2024
5bdb9d4
Improve names/docstrings.
Letizia97 Mar 14, 2024
fc07933
Improve/simplify tests.
Letizia97 Mar 14, 2024
ea09c80
Fix conflict due to Mantid jacobian not being callable.
Letizia97 Mar 14, 2024
a02ef12
Fix conflict due to Mantid jacobian not being callable, again.
Letizia97 Mar 14, 2024
5f122ca
When a jac function is provided by the user, then that should have pr…
Letizia97 Mar 14, 2024
2b7705a
Add tests for Mantid jac.
Letizia97 Mar 14, 2024
7d72d15
Test case where jac line present but no sparse_func.
Letizia97 Mar 14, 2024
7bbaf3a
Fix docstrings.
Letizia97 Mar 14, 2024
e5be290
Get MantidDevParser to work with current code.
Letizia97 Mar 14, 2024
d9c18e9
Fix test.
Letizia97 Mar 14, 2024
aed6d47
Fix linting.
Letizia97 Mar 15, 2024
de1b864
Fix linting.
Letizia97 Mar 15, 2024
fb51245
Update cutest parser.
Letizia97 Mar 15, 2024
85441e1
Reduce repetition.
Letizia97 Mar 15, 2024
44d0be0
Improve docs.
Letizia97 Mar 15, 2024
854b3e6
Update options_template.ini
Letizia97 Mar 15, 2024
49411ad
Remove comment.
Letizia97 Mar 15, 2024
d99b940
Refactoring to improve readability / runtime (caching).
Letizia97 Mar 15, 2024
421c771
Improve tests readability.
Letizia97 Mar 15, 2024
417e80b
Fix test.
Letizia97 Mar 15, 2024
ce5139b
Refactoring tests.
Letizia97 Mar 15, 2024
63f6a67
Simply tests.
Letizia97 Mar 15, 2024
747b527
Fix linting.
Letizia97 Mar 15, 2024
1f14f3f
Remove old comment
Letizia97 Mar 18, 2024
87017b0
Update jacobian_evaluations.json for consistency
Letizia97 Mar 18, 2024
527ab38
Update fitbenchmarking/utils/exceptions.py
Letizia97 Apr 10, 2024
11ae675
Update docs.
Letizia97 Apr 10, 2024
ef2d884
Fix docs.
Letizia97 Apr 10, 2024
0cb8a1a
Refactor tests.
Letizia97 Apr 10, 2024
474b1b2
Fix test.
Letizia97 Apr 10, 2024
7d17aa2
Add decorator.
Letizia97 Apr 11, 2024
66c3eaa
Refactor tests.
Letizia97 Apr 11, 2024
9dc08e2
Merge branch 'master' into 1208-Sparse_jacobian
Letizia97 Apr 11, 2024
Files changed
14 changes: 9 additions & 5 deletions docs/source/users/options/jacobian_option.rst
Expand Up @@ -9,13 +9,16 @@ The Jacobian section allows you to control which methods for computing Jacobians
Analytic (:code:`analytic`)
---------------------------

Analytic Jacobians can only be used for specific :ref:`problem_def`.
Currently the supported formats are CUTEst, Mantid, and NIST.
The only option is:
Functions for analytic Jacobians are available for specific :ref:`problem_def`. Specifically, dense Jacobian functions are
available for CUTEst, NIST and Mantid, while sparse Jacobians are available for CUTEst only. However, the user can specify custom
Jacobian functions for any :ref:`problem_def` except NIST and CUTEst, through a problem definition file. They can specify a dense
and a sparse Jacobian, and select which one to use with the following options:

* ``default`` - use the analytic derivative provided by a supported format.
* ``default`` - use the analytic (dense) derivative provided by a supported format, or the dense Jacobian function provided by the user.
* ``sparse`` - use a sparse Jacobian function, either built in or provided by the user, to compute the derivative.

Default is ``default``
Default is ``default``. For problems in supported formats, a user-specified Jacobian function takes
priority over the analytic derivative provided by the format.

.. code-block:: rst

Expand All @@ -38,6 +41,7 @@ options are:
* ``2-point`` - use the first order accuracy forward or backward difference.
* ``3-point`` - use central difference in interior points and the second order accuracy forward or backward difference near the boundary.
* ``cs`` - use a complex-step finite difference scheme. This assumes that the user function is real-valued and can be analytically continued to the complex plane. Otherwise, produces bogus results.
* ``2-point_sparse`` - use the ``2-point`` scheme together with a sparsity pattern obtained from the user-provided sparse Jacobian function.

Default is ``2-point``
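
As a rough illustration, selecting the new methods in an options file might look as follows; the
``[JACOBIAN]`` section layout is assumed from the options template rather than shown in this diff:

.. code-block:: ini

    [JACOBIAN]
    analytic: sparse
    scipy: 2-point_sparse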

Expand Down
32 changes: 27 additions & 5 deletions docs/source/users/problem_definition_files/native.rst
Expand Up @@ -25,7 +25,7 @@ keys are described below:

software
Either 'IVP', 'Mantid', 'SasView', or 'Horace' (case insensitive).

This defines whether to use an IVP format, Mantid, or SasView to generate the model.
The 'Mantid' software also supports Mantid's MultiFit functionality, which
requires the parameters listed here to be defined slightly differently.
Expand Down Expand Up @@ -72,7 +72,7 @@ input_file
``examples/benchmark_problems/Data_Assimilation/data_files/lorentz.txt``

plot_scale
The scale of the x and y axis for the plots. The options are 'loglog', 'logy', 'logx' and 'linear'. If this
The scale of the x and y axis for the plots. The options are 'loglog', 'logy', 'logx' and 'linear'. If this
is not set it will default to 'linear'.

function
Expand Down Expand Up @@ -121,10 +121,32 @@ function

SASView functions can be any of
`these <http://www.sasview.org/docs/user/qtgui/Perspectives/Fitting/models/index.html>`__.

**Horace**

The Horace functions are defined here :ref:`horace_format`
The Horace functions are defined here: :ref:`horace_format`.

jacobian
This defines a dense Jacobian function, a sparse Jacobian function, or both.
The parser allows the user to define ``g`` in the following:

.. math:: \nabla_p f(x, *args) = g(x, *args)

To do this we use a Python module, in which the user can define a dense Jacobian function
and a sparse Jacobian function (or just one of the two). As in the above formula,
both functions take the following arguments:

- *x* (np.array): A value for x to evaluate at
- *\*args* (floats): The parameters to fit

To link to these functions we use a string with the following parameters:

- *module*: The path to the module
- *dense_func*: The name of the dense jacobian function within the module
- *sparse_func*: The name of the sparse jacobian function within the module

The sparse Jacobian function provided must return a matrix in a sparse format
(e.g. coo, csr, csc), otherwise an error will be raised.
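
A minimal sketch of such a module is given below; the module name, model and exact
``key=value`` syntax of the ``jacobian`` string are illustrative assumptions (the string is
assumed to follow the same comma-separated style as the ``function`` entry), not part of this PR:

.. code-block:: python

    # my_jacobians.py - illustrative module providing both Jacobian variants
    import numpy as np
    from scipy import sparse


    def dense_jac(x, *args):
        """Dense Jacobian of f(x; p0, p1) = p0 * exp(p1 * x) w.r.t. (p0, p1)."""
        p0, p1 = args
        return np.column_stack((np.exp(p1 * x), p0 * x * np.exp(p1 * x)))


    def sparse_jac(x, *args):
        """Same Jacobian, returned as a scipy.sparse matrix (CSR format)."""
        return sparse.csr_matrix(dense_jac(x, *args))

The problem definition file would then reference these functions with something like:

.. code-block:: rst

    jacobian = 'module=my_jacobians,dense_func=dense_jac,sparse_func=sparse_jac'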

fit_ranges
This specifies the region to be fit.
Expand All @@ -133,7 +155,7 @@ fit_ranges
is the minimum in the range and the second is the maximum.

parameter_ranges
An optional setting which specifies upper and lower bounds for
An optional setting which specifies upper and lower bounds for
parameters in the problem.

Similarly to ``fit_ranges``, it takes the form where the first number
Expand Down
4 changes: 2 additions & 2 deletions fitbenchmarking/cost_func/hellinger_nlls_cost_func.py
@@ -1,7 +1,7 @@
"""
Implements the root non-linear least squares cost function
"""
from numpy import array, matmul, ravel, sqrt
from numpy import array, ravel, sqrt

from fitbenchmarking.cost_func.nlls_base_cost_func import BaseNLLSCostFunc
from fitbenchmarking.utils.exceptions import (CostFuncError,
Expand Down Expand Up @@ -85,7 +85,7 @@ def hes_res(self, params, **kwargs):

for i in range(len(x)):
jac_i = array([jac[i]])
hes[:, :, i] = matmul(jac_i.T, jac_i) / (4 * f[i] ** (3/2)) \
hes[:, :, i] = jac_i.T.dot(jac_i) / (4 * f[i] ** (3/2)) \
- hes[:, :, i] / (2 * f[i] ** (1/2))
return hes, self.jac_res(params, **kwargs)

Expand Down
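The switch from ``np.matmul`` to the ``.dot`` method in these cost functions appears to be what
lets the same code accept either a dense NumPy Jacobian or a SciPy sparse one: ``np.matmul`` does
not accept ``scipy.sparse`` matrices, while both array types expose ``.dot``. A small sketch of
the behaviour this relies on (illustrative, not part of the PR):

.. code-block:: python

    import numpy as np
    from scipy import sparse

    J_dense = np.array([[1.0, 2.0], [3.0, 4.0]])
    J_sparse = sparse.csr_matrix(J_dense)
    r = np.array([1.0, 1.0])

    # Both the dense and the sparse Jacobian support .dot with a dense vector.
    print(J_dense.T.dot(r))   # [4. 6.]
    print(J_sparse.T.dot(r))  # [4. 6.]

    # np.matmul(J_sparse.T, r) would not work, since np.matmul does not
    # understand scipy.sparse matrices; hence the use of .dot above.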
2 changes: 1 addition & 1 deletion fitbenchmarking/cost_func/nlls_base_cost_func.py
Expand Up @@ -83,7 +83,7 @@ def jac_cost(self, params, **kwargs):
r = self.eval_r(params, **kwargs)
J = self.jac_res(params, **kwargs)

return 2.0 * matmul(J.T, r)
return 2.0 * J.T.dot(r)

def hes_cost(self, params, **kwargs):
"""
Expand Down
2 changes: 1 addition & 1 deletion fitbenchmarking/cost_func/poisson_cost_func.py
Expand Up @@ -118,7 +118,7 @@ def hes_res(self, params, **kwargs):
for i in range(len(x)):
jac_i = np.array([jac[i]])
hes[:, :, i] = hes[:, :, i] - y[i] / f[i] * \
(hes[:, :, i] - np.matmul(jac_i.T, jac_i) / f[i])
(hes[:, :, i] - jac_i.T.dot(jac_i) / f[i])
return hes, self.jac_res(params, **kwargs)

def hes_cost(self, params, **kwargs):
Expand Down
22 changes: 21 additions & 1 deletion fitbenchmarking/jacobian/analytic_jacobian.py
@@ -1,8 +1,12 @@
"""
Module which acts as a analytic Jacobian calculator
"""
from scipy.sparse import issparse

from fitbenchmarking.jacobian.base_jacobian import Jacobian
from fitbenchmarking.utils.exceptions import NoJacobianError
from fitbenchmarking.utils.exceptions import (NoJacobianError,
NoSparseJacobianError,
SparseJacobianIsDenseError)


class Analytic(Jacobian):
Expand All @@ -27,6 +31,22 @@ def eval(self, params, **kwargs):
:rtype: numpy array
"""
x = kwargs.get("x", self.problem.data_x)

if self.method == "sparse":

if self.problem.sparse_jacobian is None:
raise NoSparseJacobianError(
f'The selected method is {self.method} but the '
'sparse_jacobian function is None. Please provide a '
'sparse jacobian function.')

sparse_jac = self.problem.sparse_jacobian(x, params)

if not issparse(sparse_jac):
raise SparseJacobianIsDenseError()

return sparse_jac

return self.problem.jacobian(x, params)

def name(self) -> str:
Expand Down
44 changes: 42 additions & 2 deletions fitbenchmarking/jacobian/scipy_jacobian.py
Expand Up @@ -3,8 +3,12 @@
"""
import numpy as np
from scipy.optimize._numdiff import approx_derivative
from scipy.sparse import issparse

from fitbenchmarking.jacobian.base_jacobian import Jacobian
from fitbenchmarking.utils.exceptions import (NoSparseJacobianError,
SparseJacobianIsDenseError)
from fitbenchmarking.utils.log import get_logger


class Scipy(Jacobian):
Expand All @@ -15,6 +19,11 @@ class Scipy(Jacobian):
# Problem formats that are incompatible with certain Scipy Jacobians
INCOMPATIBLE_PROBLEMS = {"cs": ["mantid"]}

def __init__(self, problem):
super().__init__(problem)
self.jac_pattern = None
self.equiv_np_method = None

def eval(self, params, **kwargs):
"""
Evaluates Jacobian of problem.eval_model
Expand All @@ -25,9 +34,40 @@ def eval(self, params, **kwargs):
:return: Approximation of the Jacobian
:rtype: numpy array
"""

self.equiv_np_method = self.method

LOGGER = get_logger()

if self.method.endswith("_sparse"):

if self.problem.sparse_jacobian is None:
raise NoSparseJacobianError(
f'The selected method is {self.method} but the '
'sparse_jacobian function is None. Please provide a '
'sparse jacobian function.')

self.jac_pattern = self.problem.\
sparse_jacobian(self.problem.data_x, params)

if not issparse(self.jac_pattern):
raise SparseJacobianIsDenseError()

# Remove the "_sparse" at the end
self.equiv_np_method = self.method[:-7]

else:
if self.problem.sparse_jacobian is not None:
LOGGER.info('Sparse_jacobian function found, but it '
'will not be used as the selected method is %s.',
self.method)

func = self.problem.eval_model
jac = approx_derivative(func, params, method=self.method,
jac = approx_derivative(func, params,
method=self.equiv_np_method,
rel_step=None,
bounds=(-np.inf, np.inf),
kwargs=kwargs)
kwargs=kwargs,
sparsity=self.jac_pattern)

return jac
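
For context on the new ``sparsity`` argument: ``scipy.optimize._numdiff.approx_derivative`` uses
the supplied sparsity structure to group columns of the Jacobian and returns the finite-difference
estimate as a sparse matrix, which is what the ``2-point_sparse`` method relies on. A minimal
sketch with an illustrative model (not part of the PR):

.. code-block:: python

    import numpy as np
    from scipy import sparse
    from scipy.optimize._numdiff import approx_derivative


    def model(p, x):
        """Toy model f(x; p) = p0 * exp(p1 * x), evaluated at every x."""
        return p[0] * np.exp(p[1] * x)


    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    params = np.array([0.1, 0.5])

    # Sparsity structure of the Jacobian; here every entry may be non-zero,
    # so a fully dense pattern is wrapped in a sparse container.
    pattern = sparse.csr_matrix(np.ones((x.size, params.size)))

    jac = approx_derivative(lambda p: model(p, x), params,
                            method='2-point', sparsity=pattern)
    print(sparse.issparse(jac))  # True: the result comes back in sparse format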
91 changes: 91 additions & 0 deletions fitbenchmarking/jacobian/tests/test_jacobian.py
Expand Up @@ -4,6 +4,8 @@
from unittest import TestCase

import numpy as np
from scipy import sparse
from scipy.sparse import issparse

from fitbenchmarking.cost_func.hellinger_nlls_cost_func import \
HellingerNLLSCostFunc
Expand All @@ -18,6 +20,8 @@
from fitbenchmarking.jacobian.scipy_jacobian import Scipy
from fitbenchmarking.parsing.fitting_problem import FittingProblem
from fitbenchmarking.utils import exceptions
from fitbenchmarking.utils.exceptions import (NoSparseJacobianError,
SparseJacobianIsDenseError)
from fitbenchmarking.utils.options import Options


Expand Down Expand Up @@ -54,6 +58,21 @@ def j(x, p):
x * p[0] * np.exp(p[1] * x)))


def j_sparse(x, p):
"""
Sparse Jacobian evaluation

:param x: x data points, defaults to self.data_x
:type x: numpy array, optional
:param p: list of parameters to fit
:type p: list

:return: Sparse Jacobian evaluation
:rtype: scipy.sparse.csr_matrix
"""
return sparse.csr_matrix(j(x, p))


def J(x, p):
"""
Analytic Jacobian residuals evaluation
Expand Down Expand Up @@ -145,6 +164,7 @@ def setUp(self):
self.fitting_problem = FittingProblem(options)
self.fitting_problem.function = f
self.fitting_problem.jacobian = j
self.fitting_problem.sparse_jacobian = j_sparse
self.fitting_problem.data_x = np.array([1, 2, 3, 4, 5])
self.fitting_problem.data_y = np.array([1, 2, 4, 8, 16])
self.cost_func = NLLSCostFunc(self.fitting_problem)
Expand Down Expand Up @@ -173,6 +193,40 @@ def test_scipy_eval(self):
eval_result = jac.eval(params=self.params)
self.assertTrue(np.isclose(self.actual, eval_result).all())

def test_scipy_eval_returns_correct_sparse_jac(self):
"""
Test whether Scipy evaluation is correct when
using sparsity
"""
jac = Scipy(self.cost_func.problem)
jac.method = '2-point_sparse'
eval_result = jac.eval(params=self.params)

self.assertTrue(np.isclose(self.actual, eval_result.todense()).all())
self.assertTrue(issparse(eval_result))

def test_scipy_eval_raises_error_sparsej_dense(self):
"""
Test that Scipy evaluation raises an error when the
result of the sparse Jacobian function is not in sparse format
"""
self.fitting_problem.sparse_jacobian = j
jac = Scipy(self.cost_func.problem)
jac.method = '2-point_sparse'
with self.assertRaises(SparseJacobianIsDenseError):
jac.eval(params=self.params)

def test_scipy_eval_raises_error_no_sparsej(self):
"""
Test that Scipy evaluation raises an error when
the sparse_jacobian function is None
"""
self.fitting_problem.sparse_jacobian = None
jac = Scipy(self.cost_func.problem)
jac.method = '2-point_sparse'
with self.assertRaises(NoSparseJacobianError):
jac.eval(params=self.params)

def test_numdifftools_eval(self):
"""
Test whether numdifftools evaluation is correct
Expand All @@ -198,6 +252,43 @@ def test_analytic_cutest_no_errors(self):
actual = J(self.fitting_problem.data_x, self.params)
self.assertTrue(np.isclose(actual, eval_result).all())

def test_analytic_eval_returns_correct_sparse_jac(self):
"""
Test whether analytic jac evaluation is correct
when using sparsity
"""
jac = Analytic(self.cost_func.problem)
jac.method = 'sparse'
self.cost_func.jacobian = jac
eval_result = self.cost_func.jac_res(params=self.params)
actual = J(self.fitting_problem.data_x, self.params)
self.assertTrue(np.isclose(actual, eval_result.todense()).all())
self.assertTrue(issparse(eval_result))

def test_analytic_eval_raises_error_no_sparsej(self):
"""
Test that analytic jac evaluation raises an error when
the sparse_jacobian function is None
"""
self.fitting_problem.sparse_jacobian = None
jac = Analytic(self.cost_func.problem)
jac.method = 'sparse'
self.cost_func.jacobian = jac
with self.assertRaises(NoSparseJacobianError):
self.cost_func.jac_res(params=self.params)

def test_analytic_eval_raises_error_sparsej_dense(self):
"""
Test that analytic jac evaluation raises an error when the
result of the sparse Jacobian function is not in sparse format
"""
self.fitting_problem.sparse_jacobian = j
jac = Analytic(self.cost_func.problem)
jac.method = 'sparse'
self.cost_func.jacobian = jac
with self.assertRaises(SparseJacobianIsDenseError):
self.cost_func.jac_res(params=self.params)

def test_analytic_cutest_weighted(self):
"""
Test analytic Jacobian
Expand Down