Commit
to_dict() and from_dict() functionality for Coregionalize Kernel and MixedNoise Likelihood class, appveyor CI resurrected (#951)

This PR adds two main things to GPy:
- to_dict() and from_dict() functions for the kernel and likelihood classes listed below
- a fix for the appveyor CI
Please see the squashed commit messages listed below.
Authors: @gehbiszumeis @ppk42 respectively
Reviewer: @ekalosak 
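The additions in this PR follow GPy's general to_dict()/from_dict() convention: every serializable object emits a JSON-compatible dict tagged with a "class" key, and reconstruction strips that tag before calling the constructor. A minimal standalone sketch of that pattern (the `SimpleKernel` class here is hypothetical, not GPy's actual implementation):

```python
import json

class SimpleKernel:
    """Hypothetical stand-in for a GPy object supporting dict (de)serialization."""
    def __init__(self, variance=1.0, name="simple"):
        self.variance = variance
        self.name = name

    def to_dict(self):
        # Everything in the returned dict must be JSON serializable.
        return {"class": "SimpleKernel", "variance": self.variance, "name": self.name}

    @staticmethod
    def from_dict(input_dict):
        input_dict = dict(input_dict)       # don't mutate the caller's dict
        input_dict.pop("class", None)       # the class tag only routes construction
        return SimpleKernel(**input_dict)

# Round trip through an actual JSON string, as a save/load to disk would do.
k = SimpleKernel(variance=2.5)
restored = SimpleKernel.from_dict(json.loads(json.dumps(k.to_dict())))
print(restored.variance)  # → 2.5
```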

---
* new: added to_dict() method to Coregionalize kernel class

* new: added to_dict() method to MixedNoise likelihood class

* fix: made Y_metadata dict content serializable

* fix: typo

* added additional needed parameters to to_dict() method for Coregionalize kernel + added _build_from_input dict method

* new: added possibility to build MixedNoise likelihood from input_dict

* Y_metadata conversion from serializable to np.array when loading from dict

* fix: rework Y_metadata part for compatibility with unittests !minor

* conda cleanup in appveyors pipeline

* conda clean up after conda update

* conda clean before conda update

* try pinning packages for conda

* revert all conda changes

* conda clean all (not only packages)

* use conda update anaconda

* pin conda package

* pin conda package

* try installing charset-normalizer beforehand

* try to get from conda-forge

* revert all conda changes

* Try to fix the conda update challenge.

See: https://community.intel.com/t5/Intel-Distribution-for-Python/Conda-update-Conda-fails/td-p/1126174

This is just an attempt with a different context (conda version).

* Still fixing build error on appveyor

I also use a newer miniconda version for newer Python versions.

* Update appveyor.yml

Thinking it over, I decided to use miniconda38 for all Python versions except Python 3.5.

* revert miniconda versioning changes

* adjust GPy version in appveyor.yml

* 1st attempt to bring the appveyor build to life again

* #955 fixing ci build on appveyor

After getting the miniconda env to work again, the wrong matplotlib version was used. This commit should fix that.

* #955 Fix CI build

Freezing numpy and scipy was a bad idea.
I now freeze matplotlib depending on the Python version only.

* add: _build_from_input_dict method for White Kernel

Co-authored-by: Peter Paul Kiefer <ppk42@users.noreply.github.com>
Co-authored-by: Peter Paul Kiefer <dafisppk@gmail.com>
3 people committed Dec 9, 2021
1 parent 3e19a85 commit bb1bc50
Showing 7 changed files with 84 additions and 12 deletions.
12 changes: 8 additions & 4 deletions GPy/core/gp.py
@@ -134,9 +134,10 @@ def to_dict(self, save_data=True):
         if self.mean_function is not None:
             input_dict["mean_function"] = self.mean_function.to_dict()
         input_dict["inference_method"] = self.inference_method.to_dict()
-        #FIXME: Assumes the Y_metadata is serializable. We should create a Metadata class
+        # TODO: We should create a Metadata class
         if self.Y_metadata is not None:
-            input_dict["Y_metadata"] = self.Y_metadata
+            # make Y_metadata serializable
+            input_dict["Y_metadata"] = {k: self.Y_metadata[k].tolist() for k in self.Y_metadata.keys()}
         if self.normalizer is not None:
             input_dict["normalizer"] = self.normalizer.to_dict()
         return input_dict
@@ -162,9 +163,12 @@ def _format_input_dict(input_dict, data=None):
             input_dict["mean_function"] = mean_function
         input_dict["inference_method"] = GPy.inference.latent_function_inference.LatentFunctionInference.from_dict(input_dict["inference_method"])

-        #FIXME: Assumes the Y_metadata is serializable. We should create a Metadata class
+        # converts Y_metadata from serializable to array. We should create a Metadata class
         Y_metadata = input_dict.get("Y_metadata")
-        input_dict["Y_metadata"] = Y_metadata
+        if isinstance(Y_metadata, dict):
+            input_dict["Y_metadata"] = {k: np.array(Y_metadata[k]) for k in Y_metadata.keys()}
+        else:
+            input_dict["Y_metadata"] = Y_metadata

         normalizer = input_dict.get("normalizer")
         if normalizer is not None:
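The Y_metadata change in gp.py converts numpy arrays to nested lists on save and back to arrays on load. A small self-contained illustration of that conversion (plain numpy outside GPy; the example array is arbitrary):

```python
import json
import numpy as np

Y_metadata = {"output_index": np.array([[0], [0], [1]])}

# to_dict direction: arrays -> lists, so the dict becomes JSON serializable
serializable = {k: Y_metadata[k].tolist() for k in Y_metadata.keys()}
payload = json.dumps({"Y_metadata": serializable})

# from_dict direction: lists -> arrays again
loaded = json.loads(payload)["Y_metadata"]
restored = {k: np.array(loaded[k]) for k in loaded.keys()}

assert np.array_equal(restored["output_index"], Y_metadata["output_index"])
```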
2 changes: 1 addition & 1 deletion GPy/core/parameterization/variational.py
@@ -106,7 +106,7 @@ def __init__(self, means=None, variances=None, name='latent space', *a, **kw):
         self.link_parameters(self.mean, self.variance)
         self.num_data, self.input_dim = self.mean.shape
         if self.has_uncertain_inputs():
-            assert self.variance.shape == self.mean.shape, "need one variance per sample and dimenion"
+            assert self.variance.shape == self.mean.shape, "need one variance per sample and dimension"

     def set_gradients(self, grad):
         self.mean.gradient, self.variance.gradient = grad
25 changes: 25 additions & 0 deletions GPy/kern/src/coregionalize.py
@@ -134,3 +134,28 @@ def gradients_X(self, dL_dK, X, X2=None):

     def gradients_X_diag(self, dL_dKdiag, X):
         return np.zeros(X.shape)
+
+    def to_dict(self):
+        """
+        Convert the object into a json serializable dictionary.
+        Note: It uses the private method _save_to_input_dict of the parent.
+        :return dict: json serializable dictionary containing the needed information to instantiate the object
+        """
+
+        input_dict = super(Coregionalize, self)._save_to_input_dict()
+        input_dict["class"] = "GPy.kern.Coregionalize"
+        # W and kappa must be serializable
+        input_dict["W"] = self.W.values.tolist()
+        input_dict["kappa"] = self.kappa.values.tolist()
+        input_dict["output_dim"] = self.output_dim
+        return input_dict
+
+    @staticmethod
+    def _build_from_input_dict(kernel_class, input_dict):
+        useGPU = input_dict.pop('useGPU', None)
+        # W and kappa must be converted back to numpy arrays
+        input_dict['W'] = np.array(input_dict['W'])
+        input_dict['kappa'] = np.array(input_dict['kappa'])
+        return Coregionalize(**input_dict)
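The same list/array conversion carries the Coregionalize parameters W and kappa through JSON. A standalone sketch of the round trip (shapes chosen arbitrarily; `W` and `kappa` here are plain numpy arrays, not GPy Param objects, and the B = W Wᵀ + diag(kappa) check assumes the usual ICM coregionalization form):

```python
import json
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))      # 3 outputs, rank-2 mixing matrix
kappa = np.ones(3)               # per-output diagonal term

# Serialize as the new to_dict() does: arrays -> nested lists
input_dict = {"W": W.tolist(), "kappa": kappa.tolist(), "output_dim": 3}
text = json.dumps(input_dict)

# Rebuild as _build_from_input_dict does: lists -> arrays
loaded = json.loads(text)
W2 = np.array(loaded["W"])
kappa2 = np.array(loaded["kappa"])

# The coregionalization matrix survives the round trip
B = W2 @ W2.T + np.diag(kappa2)
assert np.allclose(B, W @ W.T + np.diag(kappa))
```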
5 changes: 5 additions & 0 deletions GPy/kern/src/static.py
@@ -68,6 +68,11 @@ def to_dict(self):
         input_dict = super(White, self)._save_to_input_dict()
         input_dict["class"] = "GPy.kern.White"
         return input_dict
+
+    @staticmethod
+    def _build_from_input_dict(kernel_class, input_dict):
+        useGPU = input_dict.pop('useGPU', None)
+        return White(**input_dict)

     def K(self, X, X2=None):
         if X2 is None:
29 changes: 29 additions & 0 deletions GPy/likelihoods/mixed_noise.py
@@ -80,3 +80,32 @@ def samples(self, gp, Y_metadata):
             _ysim = np.array([np.random.normal(lik.gp_link.transf(gpj), scale=np.sqrt(lik.variance), size=1) for gpj in gp_filtered.flatten()])
             Ysim[flt,:] = _ysim.reshape(n1,N2)
         return Ysim
+
+    def to_dict(self):
+        """
+        Convert the object into a json serializable dictionary.
+        Note: It uses the private method _save_to_input_dict of the parent.
+        :return dict: json serializable dictionary containing the needed information to instantiate the object
+        """
+
+        # input_dict = super(MixedNoise, self)._save_to_input_dict()
+        input_dict = {"name": self.name,
+                      "class": "GPy.likelihoods.MixedNoise",
+                      "likelihoods_list": []}
+        for ii in range(len(self.likelihoods_list)):
+            input_dict["likelihoods_list"].append(self.likelihoods_list[ii].to_dict())
+
+        return input_dict
+
+    @staticmethod
+    def _build_from_input_dict(likelihood_class, input_dict):
+        import copy
+        input_dict = copy.deepcopy(input_dict)
+        # gp_link_dict = input_dict.pop('gp_link_dict')
+        # import GPy
+        # gp_link = GPy.likelihoods.link_functions.GPTransformation.from_dict(gp_link_dict)
+        # input_dict["gp_link"] = gp_link
+        input_dict['likelihoods_list'] = [Likelihood.from_dict(l) for l in input_dict['likelihoods_list']]
+        return likelihood_class(**input_dict)
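MixedNoise delegates to each child likelihood's to_dict() and rebuilds the list element-wise on load. The recursive shape of that pattern, sketched with minimal hypothetical stand-ins (`TinyGaussian` and `TinyMixedNoise` are illustrations, not GPy classes):

```python
import copy

class TinyGaussian:
    """Hypothetical stand-in for a child likelihood."""
    def __init__(self, variance=1.0):
        self.variance = variance
    def to_dict(self):
        return {"class": "TinyGaussian", "variance": self.variance}
    @staticmethod
    def from_dict(d):
        d = dict(d)
        d.pop("class", None)
        return TinyGaussian(**d)

class TinyMixedNoise:
    """Hypothetical stand-in for MixedNoise: holds a list of child likelihoods."""
    def __init__(self, likelihoods_list):
        self.likelihoods_list = likelihoods_list
    def to_dict(self):
        # Serialize children recursively, as the real to_dict() does
        return {"class": "TinyMixedNoise",
                "likelihoods_list": [lik.to_dict() for lik in self.likelihoods_list]}
    @staticmethod
    def from_dict(input_dict):
        input_dict = copy.deepcopy(input_dict)  # don't mutate the caller's dict
        input_dict.pop("class", None)
        input_dict["likelihoods_list"] = [TinyGaussian.from_dict(l)
                                          for l in input_dict["likelihoods_list"]]
        return TinyMixedNoise(**input_dict)

mixed = TinyMixedNoise([TinyGaussian(0.1), TinyGaussian(2.0)])
restored = TinyMixedNoise.from_dict(mixed.to_dict())
print([lik.variance for lik in restored.likelihoods_list])  # → [0.1, 2.0]
```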
18 changes: 12 additions & 6 deletions appveyor.yml
@@ -3,18 +3,23 @@ environment:
     secure: 8/ZjXFwtd1S7ixd7PJOpptupKKEDhm2da/q3unabJ00=
   COVERALLS_REPO_TOKEN:
     secure: d3Luic/ESkGaWnZrvWZTKrzO+xaVwJWaRCEP0F+K/9DQGPSRZsJ/Du5g3s4XF+tS
-  gpy_version: 1.9.9
+  gpy_version: 1.10.0
   matrix:
     - PYTHON_VERSION: 3.5
       MINICONDA: C:\Miniconda35-x64
+      MPL_VERSION: 3.0.0
     - PYTHON_VERSION: 3.6
-      MINICONDA: C:\Miniconda36-x64
+      MINICONDA: C:\Miniconda3-x64
+      MPL_VERSION: 3.3.4
     - PYTHON_VERSION: 3.7
-      MINICONDA: C:\Miniconda36-x64
+      MINICONDA: C:\Miniconda3-x64
+      MPL_VERSION: 3.3.4
     - PYTHON_VERSION: 3.8
-      MINICONDA: C:\Miniconda36-x64
+      MINICONDA: C:\Miniconda3-x64
+      MPL_VERSION: 3.3.4
     - PYTHON_VERSION: 3.9
-      MINICONDA: C:\Miniconda36-x64
+      MINICONDA: C:\Miniconda3-x64
+      MPL_VERSION: 3.3.4

 #configuration:
 #  - Debug
@@ -25,7 +30,8 @@ install:
   - conda config --set always_yes yes --set changeps1 no
   - conda update -q conda
   - conda info -a
-  - "conda create -q -n build-environment python=%PYTHON_VERSION% numpy scipy matplotlib"
+  # github issue #955: freeze build version of matplotlib
+  - "conda create -q -n build-environment python=%PYTHON_VERSION% numpy scipy matplotlib=%MPL_VERSION%"
   - activate build-environment
   # We need wheel installed to build wheels
   - python -m pip install wheel
5 changes: 4 additions & 1 deletion setup.py
@@ -118,8 +118,10 @@ def ismac():
 ext_mods = []

 install_requirements = ['numpy>=1.7', 'six', 'paramz>=0.9.0', 'cython>=0.29']
+matplotlib_version = 'matplotlib==3.3.4'
 if sys.version_info < (3, 6):
     install_requirements += ['scipy>=1.3.0,<1.5.0']
+    matplotlib_version = 'matplotlib==3.0.0'
 else:
     install_requirements += ['scipy>=1.3.0']

@@ -174,7 +176,8 @@ def ismac():
         'optional':['mpi4py',
                     'ipython>=4.0.0',
                     ],
-        'plotting':['matplotlib >= 3.0',
+        #matplotlib Version see github issue #955
+        'plotting':[matplotlib_version,
                     'plotly >= 1.8.6'],
         'notebook':['jupyter_client >= 4.0.6',
                     'ipywidgets >= 4.0.3',
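The setup.py change pins matplotlib per Python version at install time. The selection logic in isolation (same pins as the diff, wrapped in a hypothetical helper for illustration):

```python
import sys

def matplotlib_pin(version_info=sys.version_info):
    # Python < 3.6 is capped at matplotlib 3.0.0 (see github issue #955);
    # newer interpreters get the frozen 3.3.4 build.
    if version_info < (3, 6):
        return 'matplotlib==3.0.0'
    return 'matplotlib==3.3.4'

print(matplotlib_pin((3, 5)))  # → matplotlib==3.0.0
print(matplotlib_pin((3, 8)))  # → matplotlib==3.3.4
```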
