============
Likelihood
============

By default, the likelihood is determined by the type of transient/data being used.
However, users can choose a different likelihood. We note that there is typically only one "correct" choice of likelihood,
but there are edge cases, such as errors in time, non-detections, or uncertain y errors, which require users to use a different likelihood.
Many likelihoods, from simple to more complicated, are included in :code:`redback`.
These should cover most of the cases seen in transient data, but if not, users can write their own likelihoods.
We encourage users to contribute such likelihoods to :code:`redback`.
Regular likelihoods
-------------------

- Gaussian likelihood - a general Gaussian likelihood
- GRB Gaussian likelihood - a GRB-specific Gaussian likelihood
- Poisson likelihood - for a Poisson process
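
To make the two basic choices concrete, here is a minimal sketch of what a Gaussian and a Poisson log-likelihood compute. This is plain NumPy/SciPy for illustration, not the :code:`redback` implementation itself:

```python
import numpy as np
from scipy.special import gammaln


def gaussian_log_l(y, model, sigma):
    # Gaussian log-likelihood: each data point contributes
    # log N(y_i | model_i, sigma_i^2)
    res = y - model
    return -0.5 * np.sum((res / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))


def poisson_log_l(counts, rate):
    # Poisson log-likelihood for integer counts with expected rate(s);
    # gammaln(counts + 1) gives log(counts!) in a numerically stable way
    return np.sum(counts * np.log(rate) - rate - gammaln(counts + 1))
```

The Gaussian form is appropriate for flux or magnitude measurements with known symmetric errors, while the Poisson form is appropriate for count data such as X-ray photon counts.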
More advanced likelihoods
-------------------------

- Gaussian likelihood with additional noise - when you want to estimate some additional uncertainty on your model
- Gaussian likelihood with uniform x errors - when you have x errors that are bin widths
- Gaussian likelihood with non-detections - a general Gaussian likelihood with upper limits on some data points
- Gaussian likelihood with non-detections and quadrature noise - same as above, but with an additional noise source added in quadrature
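
As a rough illustration of the last two variants, the sketch below shows a Gaussian log-likelihood with an extra noise term added in quadrature, and one common convention for upper limits (the contribution of a non-detection is the probability that the flux falls below the limit, i.e. a Gaussian CDF). This is a hedged sketch of the general technique, not necessarily :code:`redback`'s exact implementation:

```python
import numpy as np
from scipy.stats import norm


def gaussian_log_l_quadrature(y, model, sigma, sigma_extra):
    # Gaussian log-likelihood where an additional noise source
    # sigma_extra is added to the measurement errors in quadrature
    total_var = sigma ** 2 + sigma_extra ** 2
    res = y - model
    return -0.5 * np.sum(res ** 2 / total_var + np.log(2 * np.pi * total_var))


def upper_limit_log_l(limit, model, sigma):
    # One common convention for non-detections: each upper limit
    # contributes log P(observed flux < limit), the Gaussian log-CDF
    return np.sum(norm.logcdf(limit, loc=model, scale=sigma))
```

With :code:`sigma_extra = 0` the quadrature version reduces to the ordinary Gaussian likelihood, and a model far below an upper limit contributes essentially zero log-likelihood penalty.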
Write your own likelihood
-------------------------

If none of the likelihoods implemented in :code:`redback` suit your needs, you can write your own by subclassing the
redback likelihood. For example:

.. code:: python

    import inspect

    import numpy as np


    class GaussianLikelihoodKnownNoise(redback.Likelihood):
        def __init__(self, x, y, sigma, function, kwargs):
            """
            A general Gaussian likelihood - the parameters are inferred from the
            arguments of function

            Parameters
            ----------
            x, y: array_like
                The data to analyse
            sigma: float
                The standard deviation of the noise
            function:
                The python function to fit to the data. Note, this must take the
                dependent variable as its first argument. The other arguments
                will require a prior and will be sampled over (unless a fixed
                value is given).
            kwargs: dict
                Dictionary of additional keyword arguments for the model
            """
            self.x = x
            self.y = y
            self.sigma = sigma
            self.N = len(x)
            self.function = function
            self.kwargs = kwargs

            # These lines infer the free parameters from the provided function:
            # every argument after the first (the dependent variable) is sampled
            parameters = inspect.getfullargspec(function).args
            parameters.pop(0)
            super().__init__(parameters=dict.fromkeys(parameters))

        def log_likelihood(self):
            res = self.y - self.function(self.x, **self.parameters, **self.kwargs)
            return -0.5 * (np.sum((res / self.sigma)**2)
                           + self.N * np.log(2 * np.pi * self.sigma**2))
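
To see the signature-inspection trick in isolation, here is a self-contained sketch that mimics the class above with a plain stand-in base class instead of :code:`redback.Likelihood` (the base class and the :code:`line` model here are hypothetical, for illustration only):

```python
import inspect

import numpy as np


class _BaseLikelihood:
    # Stand-in for redback.Likelihood, just for this illustration
    def __init__(self, parameters):
        self.parameters = parameters


class GaussianLikelihoodKnownNoise(_BaseLikelihood):
    def __init__(self, x, y, sigma, function, kwargs=None):
        self.x = np.asarray(x)
        self.y = np.asarray(y)
        self.sigma = sigma
        self.N = len(x)
        self.function = function
        self.kwargs = kwargs or {}
        # Every argument of `function` after the first (the x values)
        # becomes a free parameter of the likelihood
        parameters = inspect.getfullargspec(function).args
        parameters.pop(0)
        super().__init__(parameters=dict.fromkeys(parameters))

    def log_likelihood(self):
        res = self.y - self.function(self.x, **self.parameters, **self.kwargs)
        return -0.5 * (np.sum((res / self.sigma) ** 2)
                       + self.N * np.log(2 * np.pi * self.sigma ** 2))


def line(x, gradient, intercept):
    # A hypothetical model; its arguments after x become free parameters
    return gradient * x + intercept


x = np.linspace(0, 1, 5)
likelihood = GaussianLikelihoodKnownNoise(x, line(x, 2.0, 1.0), sigma=1.0,
                                          function=line)
print(likelihood.parameters)  # {'gradient': None, 'intercept': None}
likelihood.parameters.update(gradient=2.0, intercept=1.0)
print(likelihood.log_likelihood())
```

Note that :code:`parameters` is populated automatically from the model's signature, so a sampler only needs priors for :code:`gradient` and :code:`intercept`; at the true parameter values the residuals vanish and only the normalisation term remains.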