Improve documentation.
jgosmann committed Mar 30, 2014
1 parent 533f198 commit 6fedfd1
Showing 8 changed files with 47 additions and 39 deletions.
8 changes: 5 additions & 3 deletions README.rst
@@ -7,18 +7,20 @@ Overview
 --------
 
 GopPy (Gaussian Online Processes for Python) is a pure Python module providing
-a Gaussian process implementation which allows to efficiently add new data
-online. I wrote this module because all other Python implementations I know did
+a Gaussian process implementation which allows to add new data efficiently
+online. I wrote this module because all other Python implementations I knew did
 not support efficient online updates.
 
-The features include:
+The feature list:
 
 * `scikit-learn <http://scikit-learn.org>`_ compatible interface.
 * Efficient online updates.
 * Prediction of first order derivatives.
 * Estimation of the log likelihood and its derivative.
 * Well documented.
 * `Good test coverage. <https://coveralls.io/r/jgosmann/goppy>`_
+* Supports Python 2.6, 2.7, 3.2, and 3.3. Later versions are likely to work as
+  well.
 * MIT license.
 
 Documentation
8 changes: 5 additions & 3 deletions doc/index.rst
@@ -7,18 +7,20 @@ Welcome to GopPy's documentation!
 =================================
 
 GopPy (Gaussian Online Processes for Python) is a pure Python module providing
-a Gaussian process implementation which allows to efficiently add new data
-online. I wrote this module because all other Python implementations I know did
+a Gaussian process implementation which allows to add new data efficiently
+online. I wrote this module because all other Python implementations I knew did
 not support efficient online updates.
 
-The features include:
+The feature list:
 
 * `scikit-learn <http://scikit-learn.org>`_ compatible interface.
 * Efficient online updates.
 * Prediction of first order derivatives.
 * Estimation of the log likelihood and its derivative.
 * Well documented.
 * `Good test coverage. <https://coveralls.io/r/jgosmann/goppy>`_
+* Supports Python 2.6, 2.7, 3.2, and 3.3. Later versions are likely to work as
+  well.
 * :download:`MIT license <../LICENSE>`.
 
 Contents:
7 changes: 5 additions & 2 deletions doc/installation.rst
@@ -1,12 +1,15 @@
 Installation
 ============
 
-This section list the dependencies of GopPy and provides installation
+This section lists the dependencies of GopPy and provides installation
 instructions. The installation is quite easy as GopPy is a pure Python package.
 
 Requirements
 ------------
 
+GopPy supports Python 2.6, 2.7, 3.2, and 3.3. Later versions will be likely to
+work, too. In addition, the following package will be needed:
+
 * `NumPy <http://www.numpy.org/>`_
 
 If you want to run the unit tests, you will additionally need:
@@ -23,7 +26,7 @@ If you want to build the documentation, you will need:
 Install with pip
 ----------------
 
-You can easily install GopPy with pip::
+You can install GopPy easily with pip::
 
     pip install goppy
 
14 changes: 7 additions & 7 deletions doc/kernel.rst
@@ -2,7 +2,7 @@ Implementing Kernels
 ====================
 
 Implementing an own kernel is easy. Just create a class which is derived from
-:class:`.Kernel` and implement a :attr:`.Kernel.params` attribute and the
+:class:`.Kernel`. Then implement the :attr:`.Kernel.params` attribute and the
 :meth:`.Kernel.full` method. Here is some example code to get you started::
 
     from goppy import Kernel
@@ -26,7 +26,7 @@ Implementing an own kernel is easy. Just create a class which is derived from
         pass
 
 By implementing :attr:`.Kernel.params` as a property it is possible to access
-the parameter as an array (which is needed for the evaluating log likelihood
+the parameters as an array (which is needed for evaluating log likelihood
 derivatives of Gaussian processes), but also by more expressive names like
 ``k.param1``.
 
@@ -37,10 +37,10 @@ sufficient for the basic functionality when used in conjunction with
 :meth:`.Kernel.full` has also be able to return the derivatives of the kernel.
 See the documentation of :meth:`.Kernel.full` for more information.
 
-Sometimes only the diagonal of the Gram matrix is needed. This can usually be
-more efficiently calculated. Thus, it might be a good idea to add code for this
-special case by implementing :meth:`.Kernel.diag`. Otherwise the full Gram
-matrix will be calculated, but only the diagonal returned.
+Sometimes only the diagonal of the Gram matrix is needed. The diagonal can
+usually be calculated more efficiently than evaluating the full Gram matrix and
+just using the diagonal. Thus, it might be a good idea to add
+code for this special case by implementing :meth:`.Kernel.diag`.
 
 Look at :download:`the source of the kernel module <../goppy/kernel.py>` to see
-some complete example implementations of kernels.
+some complete implementations of kernels.
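
For illustration, a minimal sketch of a complete custom kernel following the
pattern this file describes (the ``ConstantKernel`` class and its single
parameter are hypothetical examples, not part of GopPy)::

    import numpy as np

    from goppy import Kernel


    class ConstantKernel(Kernel):
        """Hypothetical kernel assigning a constant covariance to all pairs."""

        def __init__(self, value=1.0):
            self.value = value

        @property
        def params(self):
            # Expose the parameters as a 1d-array; evaluating log likelihood
            # derivatives of a Gaussian process relies on this layout.
            return np.array([self.value])

        @params.setter
        def params(self, values):
            self.value = values[0]

        def full(self, x1, x2, what=('y',)):
            res = {}
            if 'y' in what:
                # Gram matrix: the same covariance for every pair of inputs.
                res['y'] = self.value * np.ones((len(x1), len(x2)))
            return res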
7 changes: 4 additions & 3 deletions doc/usage.rst
@@ -1,7 +1,7 @@
 Usage
 =====
 
-This section gives a small tutorial of the main functions of GopPy. A basic
+This section gives a small tutorial of the core functions of GopPy. A basic
 familiarity with Gaussian processes is assumed. Otherwise you might want to take
 a look at [1]_ first (there is also a free online version).
 
@@ -29,21 +29,22 @@ After fitting we can use the Gaussian process to make predictions about the
 mean function and obtain the associated uncertainty:
 
 .. literalinclude:: pyplots/usage/fit.py
+   :lines: 3-
 
 .. plot:: pyplots/usage/fit.py
 
 Adding New Data to a Gaussian Process
 -------------------------------------
 
-When further data is obtained this can be added easily to the Gaussian process
+When further data is obtained, these can be added easily to the Gaussian process
 by using the :meth:`.OnlineGP.add` method:
 
 .. literalinclude:: pyplots/usage/add.py
    :lines: 4-
 
 .. plot:: pyplots/usage/add.py
 
-If you would call the :meth:`.OnlineGP.fit` method multiple times, the process
+If you called the :meth:`.OnlineGP.fit` method multiple times, the process
 would be retrained discarding previous data instead of adding new data. You may
 also use :meth:`.OnlineGP.add` for the initial training without ever calling
 :meth:`.OnlineGP.fit`.
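
Since the included plot scripts are not shown here, a minimal sketch of the
fit/add/predict workflow they illustrate might look as follows (assuming the
``SquaredExponentialKernel`` export and an ``'mse'`` prediction key for the
uncertainty)::

    import numpy as np

    from goppy import OnlineGP, SquaredExponentialKernel

    gp = OnlineGP(SquaredExponentialKernel(lengthscales=[1.0]), noise_var=0.1)

    # Initial training.
    gp.fit(np.array([[0.0], [1.0], [2.0]]), np.array([[0.0], [0.8], [0.9]]))

    # Add further data online without retraining from scratch.
    gp.add(np.array([[3.0]]), np.array([[0.2]]))

    pred = gp.predict(np.array([[0.5], [1.5]]), what=('mean', 'mse'))
    print(pred['mean'], pred['mse'])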
22 changes: 11 additions & 11 deletions goppy/core.py
@@ -12,40 +12,40 @@
 class OnlineGP(object):
     """Online Gaussian Process.
 
-    Provides a Gaussian process to which further data can be efficiently added
+    Provides a Gaussian process to which further data can be added efficiently
     after the initial training.
 
     Parameters
     ----------
-    kernel : kernel object
+    kernel : :class:`.Kernel`
         Covariance function of the Gaussian process.
     noise_var : float, optional
         The assumed variance of the noise on the training targets.
     expected_size : int, optional
         The overall expected number of training samples to be added to the
-        Gaussian process. Setting this parameters can be more efficient memory
-        reallocations are avoided.
+        Gaussian process. Setting this parameter can be more efficient as it
+        may avoid memory reallocations.
     buffer_factory : function, optional
         Function to call to create buffer arrays for data storage.
 
     Attributes
     ----------
-    kernel : kernel object
+    kernel : :class:`.Kernel`
         Covariance function of the Gaussian process.
-    noise_var : float, optional
+    noise_var : float
         The assumed variance of the noise on the training targets.
     x_train : (`N`, `D`) ndarray
         The `N` training data inputs of dimension `D`. This will be ``None`` as
         long as the Gaussian process has not been trained.
-    y_train : ndarray
+    y_train : (`N`, `D`) ndarray
         The `N` training data targets of dimension `D`. This will be ``None``
         as long as the Gaussian process has not been trained.
     inv_chol : (`N`, `N`) ndarray
         Inverted lower Cholesky factor of the covariance matrix
         (upper triangular matrix). This will be ``None`` as long as the
         Gaussian process has not been trained.
     trained : bool
-        Indicates that the Gaussian process has been fitted to some training
+        Indicates whether the Gaussian process has been fitted to some training
         data.
 
     Examples
@@ -166,7 +166,7 @@ def predict(self, x, what=('mean',)):
         r"""Predict with the Gaussian process.
 
         Depending on the values included in the `what` parameter different
-        predictions will be made:
+        predictions will be made and returned a dictionary ``res``:
 
         * ``'mean'``: Mean prediction of the Gaussian process of shape (`N`,
           `D`).
@@ -238,8 +238,8 @@ def calc_log_likelihood(self, what=('value',)):
         * ``'value'``: The log likelihood of the Gaussian process as scalar.
         * ``'derivative'``: Partial derivatives of the log likelihood for each
-        kernel parameter as array. See the ``params`` property of the used
-        kernel for the order.
+          kernel parameter as array. See the ``params`` property of the used
+          kernel for the order.
 
         Parameters
         ----------
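
As a sketch of how these dictionary return values might be consumed (assuming
the ``'value'`` and ``'derivative'`` keys described in the docstrings above)::

    import numpy as np

    from goppy import OnlineGP, SquaredExponentialKernel

    gp = OnlineGP(SquaredExponentialKernel([1.0]), noise_var=0.1)
    gp.fit(np.array([[0.0], [1.0]]), np.array([[0.0], [1.0]]))

    res = gp.calc_log_likelihood(what=('value', 'derivative'))
    print(res['value'])       # scalar log likelihood
    print(res['derivative'])  # one partial derivative per kernel parameter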
2 changes: 1 addition & 1 deletion goppy/growable.py
@@ -21,7 +21,7 @@ class GrowableArray(object):
     buffer_shape : int or tuple of int, optional
         Initial shape of the buffer to hold the actual data. As long as the
         array shape stays below the buffer shape no new memory has to
-        reallocated.
+        allocated.
 
     Examples
     --------
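
A sketch of the intended use (the ``grow_by`` method is assumed here as the
mechanism for enlarging the visible array)::

    from goppy.growable import GrowableArray

    # Reserve a generous buffer up front so that growing the array
    # stays within the buffer and needs no reallocation.
    a = GrowableArray((1, 1), buffer_shape=(4, 4))
    a[:, :] = 1.0
    a.grow_by((1, 2))  # visible shape becomes (2, 3)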
18 changes: 9 additions & 9 deletions goppy/kernel.py
@@ -23,7 +23,7 @@ def full(self, x1, x2, what=('y',)):
         r"""Evaluate the kernel for all pairs of `x1` and `x2`.
 
         Depending on the values included in the `what` parameter different
-        evaluations will be made:
+        evaluations will be made and returned as a dictionary ``res``:
 
         * ``'y'``: Evaluate the kernel for each pair of `x1` and `x2` resulting
           in the Gram matrix.
@@ -34,8 +34,8 @@ def full(self, x1, x2, what=('y',)):
           input data points, and the kernel
           :math:`k(\mathtt{x1}, \mathtt{x2})`.
         * ``'param_derivatives'``: Evaluate the partial derivatives of the
-          kernel parameters. ``res['param_derivatives']`` will be a list with
-          the :math:`i`-th element corresponding to
+          kernel parameters. ``res['param_derivatives']`` will be a
+          list with the :math:`i`-th element corresponding to
           :math:`\left(\frac{\partial k}{d\theta_i}\right)
           \left(\mathtt{x1}, \mathtt{x2}\right)` wherein :math:`\theta_i` is
           the :math:`i`-th parameter. The order of the parameters is the same
@@ -44,7 +44,7 @@ def full(self, x1, x2, what=('y',)):
         An implementation of a kernel is not required to provide the
         functionality to evaluate ``'derivative'`` and/or
         ``'param_derivatives'``. In this case the set of available predictions
-        of a Gaussian Process might be limited. All the GopPy standard kernels
+        of a Gaussian process might be limited. All the GopPy standard kernels
         implement the complete functionality described above.
 
         Parameters
@@ -68,7 +68,7 @@ def full(self, x1, x2, what=('y',)):
         raise NotImplementedError()
 
     def diag(self, x1, x2):
-        """Evaluate the kernel only for the diagonal of the resulting matrix.
+        """Evaluate the kernel and return only the diagonal of the Gram matrix.
 
         If only the diagonal is needed, this functions may be more efficient
         than calculating the full Gram matrix with :func:`full`.
@@ -117,7 +117,7 @@ def params(self):
         """1d-array of kernel parameters.
 
         The first `D` values are the length scales for each dimension and the
-        last values is the kernel variance.
+        last value is the kernel variance.
         """
         return np.concatenate((self.lengthscales, (self.variance,)))
 
@@ -181,7 +181,7 @@ def params(self):
         """1d-array of kernel parameters.
 
         The first `D` values are the length scales for each dimension and the
-        last values is the kernel variance.
+        last value is the kernel variance.
         """
         return np.concatenate((self.lengthscales, (self.variance,)))
 
@@ -248,7 +248,7 @@ def params(self):
         """1d-array of kernel parameters.
 
         The first `D` values are the length scales for each dimension and the
-        last values is the kernel variance.
+        last value is the kernel variance.
         """
         return np.concatenate((self.lengthscales, (self.variance,)))
 
@@ -320,7 +320,7 @@ def params(self):
         """1d-array of kernel parameters.
 
         The first `D` values are the length scales for each dimension and the
-        last values is the kernel variance.
+        last value is the kernel variance.
         """
        return np.concatenate((self.lengthscales, (self.variance,)))
 
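
For instance, evaluating one of the standard kernels might look like this (a
sketch; the ``SquaredExponentialKernel`` class and its constructor arguments
are assumed from the package's public interface)::

    import numpy as np

    from goppy import SquaredExponentialKernel

    kernel = SquaredExponentialKernel(lengthscales=[1.0], variance=2.0)
    x1 = np.array([[0.0], [1.0]])
    x2 = np.array([[0.0], [0.5], [1.0]])

    res = kernel.full(x1, x2, what=('y',))
    print(res['y'].shape)  # (2, 3) Gram matrix

    # Parameter layout as documented above: length scales first,
    # kernel variance last.
    print(kernel.params)   # [1.0, 2.0]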
