Commit

Merge branch 'master' of github.com:lbl-camera/fvGP
MarcusMNoack committed Dec 5, 2023
2 parents 7695e5e + dafcb6f commit 8ddeae3
Showing 1 changed file with 12 additions and 6 deletions.
fvgp/gp.py (12 additions & 6 deletions)
```diff
@@ -38,9 +38,9 @@ class GP:
     ----------
     input_space_dim : int
         Dimensionality of the input space (D). If the input is non-Euclidean, the input dimensionality will be ignored.
-    x_data : np.ndarray or set or list
+    x_data : np.ndarray or list of tuples
         The input point positions. Shape (V x D), where D is the `input_space_dim`. If dealing with non-Euclidean inputs
-        x_data should be a set or iterable, not a numpy array.
+        x_data should be an iterable, not a numpy array.
     y_data : np.ndarray
         The values of the data points. Shape (V,1) or (V).
     init_hyperparameters : np.ndarray, optional
```
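
For orientation, here is a minimal sketch of constructing a GP with the parameters documented in the hunk above. The keyword names are taken from the docstring; the hyperparameter count and any remaining constructor arguments are assumptions, not verified against the full source.

```python
import numpy as np
from fvgp import GP

# V = 20 points in a D = 1 Euclidean input space.
x = np.random.rand(20, 1)          # shape (V x D), per the docstring
y = np.sin(5.0 * x[:, 0])          # shape (V,), per the docstring

# Keyword names follow the documented parameters; the number of
# hyperparameters depends on the kernel and is assumed here.
gp = GP(input_space_dim=1,
        x_data=x,
        y_data=y,
        init_hyperparameters=np.array([1.0, 1.0]))
```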
```diff
@@ -2001,8 +2001,10 @@ def gp_total_correlation(self, x_pred, x_out=None):
     ###########################################################################
     def gp_relative_information_entropy(self, x_pred, x_out=None):
         """
-        Function to compute KL divergence and therefore relative information entropy
-        of the prior distribution over-predicted function values and the posterior distribution.
+        Function to compute the KL divergence and therefore the relative information entropy
+        of the prior distribution over predicted function values and the posterior distribution.
+        The value is a reflection of how much information is predicted to be gained
+        through observing a set of data points at x_pred.

         Parameters
         ----------
```
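
For readers unfamiliar with the quantity being documented: the KL divergence between two multivariate Gaussians has a standard closed form, sketched below with NumPy. This is background math only, not fvGP's implementation; the function name and the dense Cholesky factorization are illustrative choices.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ), the standard closed form.

    A background sketch only; fvGP's internals may differ.
    """
    k = len(mu0)
    L = np.linalg.cholesky(cov1)                    # cov1 = L @ L.T
    def apply_inv(b):                               # applies cov1^{-1} to b
        return np.linalg.solve(L.T, np.linalg.solve(L, b))
    diff = mu1 - mu0
    trace_term = np.trace(apply_inv(cov0))          # tr(cov1^{-1} cov0)
    quad_term = diff @ apply_inv(diff)              # squared Mahalanobis term
    logdet1 = 2.0 * np.sum(np.log(np.diag(L)))      # log det(cov1) via Cholesky
    _, logdet0 = np.linalg.slogdet(cov0)
    return 0.5 * (trace_term + quad_term - k + logdet1 - logdet0)
```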
```diff
@@ -2033,8 +2035,12 @@ def gp_relative_information_entropy(self, x_pred, x_out=None):
     ###########################################################################
     def gp_relative_information_entropy_set(self, x_pred, x_out=None):
         """
-        Function to compute KL divergence and therefore relative information entropy
-        if the prior distribution over-predicted function values and the posterior distribution.
+        Function to compute the KL divergence and therefore the relative information entropy
+        of the prior distribution over predicted function values and the posterior distribution.
+        The value is a reflection of how much information is predicted to be gained
+        through observing each data point in x_pred separately, not all
+        at once as in `gp_relative_information_entropy`.
+
         Parameters
         ----------
```
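
A hedged usage sketch contrasting the two methods documented above, assuming a trained GP instance named `gp` (see the constructor sketch earlier). The exact structure of the returned values is not shown in this diff, so the results are not unpacked here.

```python
import numpy as np

# Candidate measurement positions, shape (N x D).
x_pred = np.linspace(0.0, 1.0, 50).reshape(-1, 1)

# Information predicted to be gained by observing all candidates at once ...
joint = gp.gp_relative_information_entropy(x_pred)

# ... versus observing each candidate separately, which is a natural
# acquisition heuristic for picking one maximally informative point.
per_point = gp.gp_relative_information_entropy_set(x_pred)
```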
