Commit 590f018

Built site for gh-pages

Quarto GHA Workflow Runner committed Apr 15, 2024
1 parent 6e874a9 commit 590f018
Showing 5 changed files with 1,019 additions and 984 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
@@ -1 +1 @@
-4cde6a5e
+9381664f
4 changes: 2 additions & 2 deletions api/Model.html
@@ -579,7 +579,7 @@ <h4 class="anchored" data-anchor-id="parameters-2">Parameters</h4>
<tr class="even">
<td><code>inference_method</code></td>
<td>str</td>
-<td>The method to use for fitting the model. By default, <code>"mcmc"</code>. This automatically assigns a MCMC method best suited for each kind of variables, like NUTS for continuous variables and Metropolis for non-binary discrete ones. Alternatively, <code>"vi"</code>, in which case the model will be fitted using variational inference as implemented in PyMC using the <code>fit</code> function. Finally, <code>"laplace"</code>, in which case a Laplace approximation is used and is not recommended other than for pedagogical use. To get a list of JAX based inference methods, call <code>model.backend.inference_methods['bayeux']</code>. This will return a dictionary of the available methods such as <code>blackjax_nuts</code>, <code>numpyro_nuts</code>, among others.</td>
+<td>The method to use for fitting the model. By default, <code>"mcmc"</code>. This automatically assigns a MCMC method best suited for each kind of variables, like NUTS for continuous variables and Metropolis for non-binary discrete ones. Alternatively, <code>"vi"</code>, in which case the model will be fitted using variational inference as implemented in PyMC using the <code>fit</code> function. Finally, <code>"laplace"</code>, in which case a Laplace approximation is used and is not recommended other than for pedagogical use. To get a list of JAX based inference methods, call <code>bmb.inference_methods.names['bayeux']</code>. This will return a dictionary of the available methods such as <code>blackjax_nuts</code>, <code>numpyro_nuts</code>, among others.</td>
<td><code>'mcmc'</code></td>
</tr>
<tr class="odd">
@@ -644,7 +644,7 @@ <h4 class="anchored" data-anchor-id="returns-2">Returns</h4>
<td></td>
</tr>
<tr class="odd">
-<td><code>model.backend.inference_methods\['bayeux'\]\['mcmc\]</code>.</td>
+<td><code>bmb.inference_methods.names\['bayeux'\]\['mcmc\]</code>.</td>
<td></td>
</tr>
<tr class="even">
4 changes: 2 additions & 2 deletions api/Model.qmd
@@ -86,7 +86,7 @@ Fit the model using PyMC.
| `discard_tuned_samples` | bool | Whether to discard posterior samples of the tune interval. Defaults to ``True``. | `True` |
| `omit_offsets` | bool | Omits offset terms in the ``InferenceData`` object returned when the model includes group specific effects. Defaults to ``True``. | `True` |
| `include_mean` | bool | Compute the posterior of the mean response. Defaults to ``False``. | `False` |
-| `inference_method` | str | The method to use for fitting the model. By default, ``"mcmc"``. This automatically assigns a MCMC method best suited for each kind of variables, like NUTS for continuous variables and Metropolis for non-binary discrete ones. Alternatively, ``"vi"``, in which case the model will be fitted using variational inference as implemented in PyMC using the ``fit`` function. Finally, ``"laplace"``, in which case a Laplace approximation is used and is not recommended other than for pedagogical use. To get a list of JAX based inference methods, call ``model.backend.inference_methods['bayeux']``. This will return a dictionary of the available methods such as ``blackjax_nuts``, ``numpyro_nuts``, among others. | `'mcmc'` |
+| `inference_method` | str | The method to use for fitting the model. By default, ``"mcmc"``. This automatically assigns a MCMC method best suited for each kind of variables, like NUTS for continuous variables and Metropolis for non-binary discrete ones. Alternatively, ``"vi"``, in which case the model will be fitted using variational inference as implemented in PyMC using the ``fit`` function. Finally, ``"laplace"``, in which case a Laplace approximation is used and is not recommended other than for pedagogical use. To get a list of JAX based inference methods, call ``bmb.inference_methods.names['bayeux']``. This will return a dictionary of the available methods such as ``blackjax_nuts``, ``numpyro_nuts``, among others. | `'mcmc'` |
| `init` | str | Initialization method. Defaults to ``"auto"``. The available methods are: * auto: Use ``"jitter+adapt_diag"`` and if this method fails it uses ``"adapt_diag"``. * adapt_diag: Start with a identity mass matrix and then adapt a diagonal based on the variance of the tuning samples. All chains use the test value (usually the prior mean) as starting point. * jitter+adapt_diag: Same as ``"adapt_diag"``, but use test value plus a uniform jitter in [-1, 1] as starting point in each chain. * advi+adapt_diag: Run ADVI and then adapt the resulting diagonal mass matrix based on the sample variance of the tuning samples. * advi+adapt_diag_grad: Run ADVI and then adapt the resulting diagonal mass matrix based on the variance of the gradients during tuning. This is **experimental** and might be removed in a future release. * advi: Run ADVI to estimate posterior mean and diagonal mass matrix. * advi_map: Initialize ADVI with MAP and use MAP as starting point. * map: Use the MAP as starting point. This is strongly discouraged. * adapt_full: Adapt a dense mass matrix using the sample covariances. All chains use the test value (usually the prior mean) as starting point. * jitter+adapt_full: Same as ``"adapt_full"``, but use test value plus a uniform jitter in [-1, 1] as starting point in each chain. | `'auto'` |
| `n_init` | int | Number of initialization iterations. Only works for ``"advi"`` init methods. | `50000` |
| `chains` | int | The number of chains to sample. Running independent chains is important for some convergence statistics and can also reveal multiple modes in the posterior. If ``None``, then set to either ``cores`` or 2, whichever is larger. | `None` |
@@ -100,7 +100,7 @@ Fit the model using PyMC.
|-----------------------------------------------------------------------------------|---------------|
| An ArviZ ``InferenceData`` instance if inference_method is ``"mcmc"`` (default), | |
| "laplace", or one of the MCMC methods in | |
-| ``model.backend.inference_methods\['bayeux'\]\['mcmc\]``. | |
+| ``bmb.inference_methods.names\['bayeux'\]\['mcmc\]``. | |
| An ``Approximation`` object if ``"vi"``. | |

### graph { #bambi.Model.graph }
