
Ms quantity distributions #98

Open · wants to merge 30 commits into base: master
Conversation

@martinspetlik martinspetlik commented Aug 25, 2020

The MS_quantity branch contains code for approximating the distribution function using the maximum entropy method, and also using BSplines and other approaches that were used for the Diploma Thesis.

Quantity is supported.

@jbrezmorf jbrezmorf left a comment


Push fresh commits


def _samples(self, i_chunk):
@clearable_cache(custom_key_maker=get_cache_key)

? why clearable
? can we use max_size=1
? possible simplification of get_cache_key:

  • just return a tuple
  • no need to also hash the input quantities, since these are unique per self anyway
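One possible reading of these suggestions, sketched with `functools.lru_cache`; the class body and the returned value are placeholders, not the PR's actual code:

```python
# Placeholder sketch: with i_chunk as the only argument besides self,
# the built-in lru_cache keying (a tuple of the arguments) is enough,
# and maxsize=1 keeps just the most recently used chunk.
import functools

class Quantity:
    def __init__(self, input_quantities):
        # unique per instance, so there is no need to hash them into the key
        self._input_quantities = input_quantities

    @functools.lru_cache(maxsize=1)
    def _samples(self, i_chunk):
        # stands in for the real per-chunk sample extraction
        return ("chunk", i_chunk)

q = Quantity(["q1", "q2"])
assert q._samples(0) == ("chunk", 0)  # second identical call is served from cache
```

Note that `lru_cache` on a method already includes `self` in the key, which matches the point that the input quantities are unique per instance anyway.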

@jbrezmorf jbrezmorf left a comment


Well done. Just some minor points for improvement.

@@ -77,6 +77,9 @@ def __call__(self, value):
def eval(self, i, value):
return self._eval_all(value, i+1)[:, -1]

def eval_fine_coarse(self, i, value):

What is this used for? It is inefficient to compute all moments up to i and then select only the last one.
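A hypothetical illustration of the efficiency concern; `eval_all` below stands in for the real `_eval_all` using monomial moments, the actual class differs:

```python
# Computing all moments up to i and keeping only the last one does O(i)
# work per call, where a direct evaluation of the i-th moment would not.
import numpy as np

def eval_all(value, size):
    # stand-in: monomial moments 1, x, x^2, ..., x^(size-1) for each value
    value = np.atleast_1d(value)
    return np.power(value[:, None], np.arange(size))

def eval_last_via_all(i, value):
    return eval_all(value, i + 1)[:, -1]   # builds i+1 columns, keeps one

def eval_direct(i, value):
    return np.atleast_1d(value) ** i       # computes only the needed moment

x = np.array([2.0, 3.0])
assert np.allclose(eval_last_via_all(3, x), eval_direct(3, x))  # both give [8, 27]
```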

@@ -187,14 +200,30 @@ def load_scheduled_samples(self):
def sample_pairs(self):
"""
Sample results split to numpy arrays
:param i_chunk: identifier of chunk

Does sample_pairs still make sense? Especially since i_chunk is not supported?

:param i_chunk: int, chunk id
:return:
"""
if i_chunk != 0:

This looks odd; shouldn't it be:
if i_chunk == 0:?
The same applies to the memory storage.
Do you have a test that works with multiple chunks?
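To illustrate the expected behavior, a hypothetical chunked accessor (not the storage code in the PR) where the expensive load happens on the first chunk, not on every chunk except the first:

```python
# Hypothetical sketch: the load should trigger when i_chunk == 0,
# i.e. on the first chunk; later chunks reuse the loaded data.
def sample_pairs_chunked(data, chunk_size):
    cache = {}

    def get_chunk(i_chunk):
        if i_chunk == 0:                  # load once, on the first chunk
            cache["all"] = list(data)     # stands in for reading from storage
        start = i_chunk * chunk_size
        return cache["all"][start:start + chunk_size]

    return get_chunk

get_chunk = sample_pairs_chunked(range(10), 4)
assert get_chunk(0) == [0, 1, 2, 3]
assert get_chunk(1) == [4, 5, 6, 7]
assert get_chunk(2) == [8, 9]
```

A test along these lines, iterating over more than one chunk, would catch an inverted `if i_chunk != 0:` condition immediately.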

# cov = (cov_fine - cov_coarse) / self.n_samples
#
# return cov
def estimate_covariance(self, moments_fn, stable=False):

We should do the covariance matrix estimate through quantities and the estimate_mean algorithm.
The stable summation algorithm tries to reduce arithmetic errors, which can also be an issue for the general estimation algorithm. See, e.g., https://en.wikipedia.org/wiki/Kahan_summation_algorithm

That can have some effect if we try to obtain very precise estimates using millions of samples.
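The Kahan idea referred to above can be sketched in a few lines; this is an illustrative snippet, not code from the PR:

```python
# Minimal Kahan (compensated) summation: a running correction term
# recovers the low-order bits lost when a small term is added to a
# large running total.
def kahan_sum(values):
    total = 0.0
    compensation = 0.0                   # correction for lost low-order bits
    for v in values:
        y = v - compensation             # apply the correction to the next term
        t = total + y                    # big + small: low-order digits of y are lost
        compensation = (t - total) - y   # algebraically zero; captures the loss
        total = t
    return total

# Many tiny terms added to one large term: naive left-to-right summation drifts,
# which is exactly the regime of very precise estimates from millions of samples.
values = [1e8] + [1e-8] * 100000
assert abs(kahan_sum(values) - 100000000.001) < 1e-6   # compensated: accurate
assert abs(sum(values) - 100000000.001) > 1e-4         # naive: visible drift
```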

@@ -116,7 +116,10 @@ def calculate(config, seed):
for result in [fine_result, coarse_result]:

A general note on synth_simulation.py: currently this is just a test mock class and should live in the test directory, possibly as a simple tutorial. But we could also implement a SyntheticSimulation based on a given exact distribution.


return self.frame0._copy(*[q for t, q in self.time_frames], operation=max_op)
def __getitem__(self, key):

Should be reused from quantity or rather from QType descendants. See the note above.

new_shape = np.empty(self._shape)[key].shape

# One selected item is considered to be a scalar QType
if len(new_shape) == 1 and new_shape[0] == 1:

We should keep the produced new_shape. Be as close to the numpy behavior as possible.
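NumPy's own indexing rules, which this comment asks to mirror, can be checked directly; this snippet is illustrative, not code from the PR:

```python
# NumPy keeps a length-1 axis produced by a slice and drops an axis only
# for an integer index; collapsing shape (1,) to a scalar diverges from this.
import numpy as np

a = np.empty((3, 4))
assert a[0:1].shape == (1, 4)   # slice of length 1: axis is kept
assert a[0].shape == (4,)       # integer index: axis is dropped
assert a[0, 0].shape == ()      # true scalar only when all axes are indexed
```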


# Result is also array
if len(new_shape) > 0:
q_type = ArrayType(new_shape, qtype=copy.deepcopy(self._qtype))

Why do we need a deep copy of the type?


def custom_copy(quantities):

Possibly make it a private static method of Quantity and give it a more descriptive name; 'custom' says nothing.
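A possible shape of that refactor; the method name and body here are hypothetical stand-ins, not the PR's implementation:

```python
# Hypothetical sketch: the module-level custom_copy becomes a private
# static method of Quantity with a name that says what it does.
import copy

class Quantity:
    def __init__(self, qtype):
        self.qtype = qtype

    @staticmethod
    def _deep_copy_quantities(quantities):
        # stands in for the original custom_copy(quantities)
        return [copy.deepcopy(q) for q in quantities]

qs = [Quantity("scalar"), Quantity("array")]
copies = Quantity._deep_copy_quantities(qs)
assert copies[0] is not qs[0] and copies[0].qtype == "scalar"
```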

@@ -0,0 +1,468 @@
import os

I still have to go through the test...

@martinspetlik martinspetlik changed the title Ms quantity Ms quantity distributions Jan 22, 2021