Merge pull request #244 from DuckNrOne/CorrectGramTypo
Improved grammar and typo
tBuLi committed Jun 11, 2019
2 parents 4b5932c + 445aec8 commit a9c8815
Showing 1 changed file with 16 additions and 17 deletions.
33 changes: 16 additions & 17 deletions docs/fitting_types.rst
@@ -48,7 +48,7 @@ order to provide these, it's nicer to use a named model::
Please note that this is the opposite of the convention used by scipy's
:func:`~scipy.optimize.curve_fit`. Looking through their mailing list this
seems to have been implemented the opposite way for historical reasons, and
-was understandably never changed so as not to loose backwards compatibility.
+was understandably never changed so as not to lose backwards compatibility.
Since this is a new project, we don't have that problem.
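The named-model code this hunk refers to lies outside the visible diff. As a rough sketch, with made-up data and names rather than the exact ones from the docs, passing per-point standard deviations to a named model looks roughly like::

    import numpy as np
    from symfit import parameters, variables, Fit

    a, b = parameters('a, b')
    x, y = variables('x, y')
    model = {y: a * x + b}

    xdata = np.linspace(0, 10, 25)
    ydata = 3.2 * xdata + 1.1 + np.random.normal(0, 0.5, xdata.shape)
    sigma = 0.5 * np.ones_like(ydata)  # one standard deviation per data point

    fit = Fit(model, x=xdata, y=ydata, sigma_y=sigma)
    fit_result = fit.execute()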

.. _constrained-leastsq:
@@ -119,7 +119,7 @@ As always, bounds on parameters and even constraints are supported.

Minimize/Maximize
-----------------
-Minimize or Maximize a model subject to bounds and/or constraints. As an example
+Minimize or Maximize a model subject to bounds and/or constraints. As an example,
I present an example from the
`scipy docs <https://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html>`_.

@@ -131,12 +131,12 @@ Suppose we want to maximize the following function:
Subject to the following constraints:

.. math::

   x^3 - y = 0

.. math::

   y - 1 >= 0

In SciPy code the following lines are needed::
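Both the SciPy version and the symfit version of this example lie outside the visible hunk. As a hedged sketch, using the objective from the linked scipy tutorial, the constrained maximization can be written in symfit along these lines (the exact code in the docs may differ)::

    from symfit import parameters, Fit, Eq, Ge

    x, y = parameters('x, y')
    objective = 2 * x * y + 2 * x - x**2 - 2 * y**2
    constraints = [
        Eq(x**3 - y, 0),
        Ge(y - 1, 0),
    ]

    # Maximizing the objective is the same as minimizing its negative.
    fit = Fit(-objective, constraints=constraints)
    fit_result = fit.execute()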

@@ -234,11 +234,11 @@ that since we don't know the concentration of B, we explicitly set ``b=None``
when calling :class:`~symfit.core.fit.Fit` so it will be ignored.
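For orientation, here is a minimal sketch of the pattern described in this paragraph; the reaction, the numbers, and the names other than ``b`` are illustrative assumptions, not taken from the docs::

    import numpy as np
    from symfit import variables, Parameter, ODEModel, D, Fit

    tdata = np.array([10.0, 26.0, 44.0, 70.0, 120.0])
    adata = np.array([44.0, 34.0, 27.0, 20.0, 14.0])

    a, b, t = variables('a, b, t')
    k = Parameter('k', 0.1)

    # A simple first-order decay A -> B, purely for illustration.
    model_dict = {
        D(a, t): -k * a,
        D(b, t): k * a,
    }
    ode_model = ODEModel(model_dict, initial={t: tdata[0], a: adata[0], b: 0.0})

    # No data is available for b, so it is explicitly ignored with b=None.
    fit = Fit(ode_model, t=tdata, a=adata, b=None)
    fit_result = fit.execute()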

.. warning:: Fitting to ODEs is extremely difficult from an algorithmic point
-of view, since these systems are usually very sensitive to the parameters.
+of view since these systems are usually very sensitive to the parameters.
Using (very) good initial guesses for the parameters and initial values is
critical.

-Upon every iteration of performing the fit the ODEModel is integrated again from
+Upon every iteration of performing the fit, the ODEModel is integrated again from
the initial point using the new guesses for the parameters.

We can plot it just like always::
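The plotting code itself is cut off by the diff. Continuing the hypothetical A -> B sketch above, it would look something like::

    import matplotlib.pyplot as plt

    taxis = np.linspace(0, 150, 500)
    a_fit, b_fit = ode_model(t=taxis, **fit_result.params)

    plt.scatter(tdata, adata, label='data')
    plt.plot(taxis, a_fit, label='[A]')
    plt.plot(taxis, b_fit, label='[B]')
    plt.xlabel('t')
    plt.legend()
    plt.show()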
@@ -316,14 +316,14 @@ More common examples, such as dampened harmonic oscillators also work as expected
D(v, t): a,
}
ode_model = ODEModel(model_dict, initial={t: 0, v: 0, x: 1})

# Let's create some data...
times = np.linspace(0, 15, 150)
data = ode_model(times, k=11, gamma=0.9, m=m.value).x
# ... and add some noise to it.
noise = np.random.normal(1, 0.1, data.shape) # 10% error
data *= noise

fit = Fit(ode_model, t=times, x=data)
fit_result = fit.execute()
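The top of this snippet (the symbol definitions and ``model_dict``) sits above the hunk and is not shown. A possible completion, with guessed parameter values and the imports the whole example needs, is::

    import numpy as np
    from symfit import variables, parameters, ODEModel, D, Fit

    t, x, v = variables('t, x, v')
    k, gamma, m = parameters('k, gamma, m')
    m.value = 1.0
    m.fixed = True  # treat the mass as a known constant

    # Dampened harmonic oscillator: m*x'' + gamma*x' + k*x = 0,
    # written as a first-order system in x and v.
    a = -(k * x + gamma * v) / m
    model_dict = {
        D(x, t): v,
        D(v, t): a,
    }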

@@ -345,7 +345,7 @@ between the fits to the different datasets. The same syntax used for ODE
fitting makes this problem very easy to solve in :mod:`symfit`.

As a simple example, suppose we have two datasets measuring exponential decay,
-with the same background, but different amplitude and decay rate.
+with the same background, but the different amplitude and decay rate.

.. math::
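The decay model itself, and the code that builds it, fall outside the visible hunk. A hedged sketch of the shared-parameter setup described above, with invented names and numbers, might look like::

    import numpy as np
    from sympy import exp
    from symfit import variables, parameters, Fit, Model

    x_1, x_2, y_1, y_2 = variables('x_1, x_2, y_1, y_2')
    y0, a_1, a_2, b_1, b_2 = parameters('y0, a_1, a_2, b_1, b_2')

    # Two decays sharing the background y0, each with its own
    # amplitude a_i and decay rate b_i.
    model = Model({
        y_1: y0 + a_1 * exp(-b_1 * x_1),
        y_2: y0 + a_2 * exp(-b_2 * x_2),
    })

    xdata1 = np.linspace(0, 10, 50)
    xdata2 = np.linspace(0, 10, 50)
    ydata1 = 1.0 + 4.0 * np.exp(-2.0 * xdata1)
    ydata2 = 1.0 + 2.5 * np.exp(-1.2 * xdata2)

    fit = Fit(model, x_1=xdata1, x_2=xdata2, y_1=ydata1, y_2=ydata2)
    fit_result = fit.execute()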
@@ -388,15 +388,15 @@ are added squared, ready to be minimized. Unlike in the above example, the
Global Minimization
-------------------
Very often, there are multiple solutions to a fitting (or minimisation)
-problem. These are local minima of the objective function. The best solution of
-course is the global minimum, but most minimization algorithms will only find a
+problem. These are local minima of the objective function. The best solution, of
+course, is the global minimum, but most minimization algorithms will only find a
local minimum, and thus the answer you get will depend on the initial values of
your parameters. This can be incredibly annoying if you have no further
knowledge about your system.

Luckily, global minimizers exist which are not influenced by the initial
guesses for your parameters. In symfit, two such algorithms from :mod:`scipy`
-have been wrapped for this pourpose. Firstly, the
+have been wrapped for this purpose. Firstly, the
:func:`~scipy.optimize.differential_evolution` algorithm from :mod:`scipy` is
wrapped as :class:`~symfit.core.minimizers.DifferentialEvolution`. Secondly,
the :func:`~scipy.optimize.basinhopping` algorithm is available as
@@ -417,7 +417,7 @@ just tell :class:`~symfit.core.fit.Fit`::
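The code block that follows this line is not part of the hunk. A minimal, hedged sketch of selecting the global minimizer (the model and data here are invented) would be::

    import numpy as np
    from sympy import cos
    from symfit import Parameter, variables, Fit
    from symfit.core.minimizers import DifferentialEvolution

    x, y = variables('x, y')
    # Differential evolution needs finite min/max bounds on every parameter.
    w = Parameter('w', value=2, min=0.1, max=10)
    model = {y: cos(w * x)}

    xdata = np.linspace(0, 10, 100)
    ydata = np.cos(3.4 * xdata)

    fit = Fit(model, x=xdata, y=ydata, minimizer=DifferentialEvolution)
    fit_result = fit.execute()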

However, due to how this algorithm works, it's not great at finding the exact
minimum (but it will find it if given enough time). You can work around this by
-"chaining" minimizers: first run a global minimization to (hopefully) get close
+"chaining" minimizers: first, run a global minimization to (hopefully) get close
to your answer, and then polish it off using a local minimizer::

fit = Fit(model, minimizer=[DifferentialEvolution, BFGS])
@@ -426,12 +426,12 @@ to your answer, and then polish it off using a local minimizer::
Global minimizers such as differential evolution and basin-hopping are
rather sensitive to their hyperparameters. You might
need to play with those to get appropriate results, e.g.::

fit.execute(DifferentialEvolution={'popsize': 20, 'recombination': 0.9})

.. note::
There is no way to guarantee that the minimum found is actually the global
-minimum. Unfortunately there is no way around this. Therefore, you should
+minimum. Unfortunately, there is no way around this. Therefore, you should
always critically inspect the results.

Constrained Basin-Hopping
@@ -512,10 +512,9 @@ What if the model is unnamed?
Then you'll have to use the ordering. Variables throughout :mod:`symfit`'s
objects are internally ordered in the following way: first independent
variables, then dependent variables, then sigma variables, and lastly
-parameters when applicable. Within each group alphabetical ordering applies.
+parameters when applicable. Within each group, alphabetical ordering applies.

It is therefore always possible to assign data to variables in an unambiguous
way using this ordering. For example::

fit = Fit(model, x_data, y_data, sigma_y_data)
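As a hedged, self-contained illustration of that ordering (the model and data here are invented)::

    import numpy as np
    from symfit import Variable, Parameter, Fit

    x = Variable('x')
    a = Parameter('a')
    model = a * x**2  # unnamed model: no explicit dependent variable

    x_data = np.linspace(0, 5, 20)
    y_data = 2.7 * x_data**2
    sigma_y_data = 0.1 * np.ones_like(y_data)

    # Positional order: independent vars, then dependent vars, then sigmas.
    fit = Fit(model, x_data, y_data, sigma_y_data)
    fit_result = fit.execute()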
