Prepare branch for the TFP 0.12.0rc4 release #1191
Conversation
… during MCMC. The `PreconditionedHamiltonianMonteCarlo` kernel accepts a momentum distribution from which it samples momentum. For example, if you suspect your posterior covariance is `S`, then using a kinetic energy function of `K(p) = 0.5 * (p.T S p)` may improve the performance of Hamiltonian Monte Carlo. This transition kernel is intended to be used to estimate a diagonal covariance matrix, by updating the underlying momentum distribution according to an online estimate of the posterior variance. Note that this is *not* a valid MCMC scheme, but the user may use the estimate this transition kernel produces to supply a fixed momentum distribution for sampling. PiperOrigin-RevId: 343530087
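The two pieces described above, an online variance estimate and a preconditioned kinetic energy, can be sketched in plain Python. This is an illustrative standalone sketch, not TFP's implementation; the names `online_variance_update` and `kinetic_energy` are hypothetical, and the update rule shown is the standard Welford recurrence.

```python
def online_variance_update(state, x):
    """One Welford-style update of a running mean/variance estimate.

    `state` is a (count, mean, m2) triple; m2 accumulates the sum of
    squared deviations, so m2 / (count - 1) is the sample variance.
    """
    count, mean, m2 = state
    count += 1
    delta = x - mean
    mean += delta / count
    m2 += delta * (x - mean)
    return count, mean, m2


def kinetic_energy(p, s_diag):
    """K(p) = 0.5 * p.T S p for a diagonal covariance S, element-wise."""
    return 0.5 * sum(pi * pi * si for pi, si in zip(p, s_diag))
```

Once the running variance has converged, its diagonal can be frozen and supplied as the momentum distribution's covariance for subsequent (valid) HMC sampling.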
…base init. PiperOrigin-RevId: 343541350
Unfortunately, XLA:CPU will optimize away the correction term, making this useless, unless reassociation is disabled (which may make other reductions slower). (We add this env var in the test in the __name__=='__main__' block.) However, TPU and GPU are where we would often want to use this, when resolving differences between large-magnitude `log_prob`s. PiperOrigin-RevId: 343543748
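The correction term in question is the compensation variable of Kahan-style summation. A minimal sketch of the pattern (not TFP's actual implementation) shows why a reassociating fast-math pass defeats it: algebraically `(t - total) - y` is zero, so a compiler that reassociates floating-point arithmetic simplifies the whole thing to a plain sum.

```python
def kahan_sum(values):
    """Compensated (Kahan) summation.

    `c` carries the low-order bits lost when a small value is added to a
    large running total, so they can be re-applied on later iterations.
    """
    total = 0.0
    c = 0.0
    for v in values:
        y = v - c
        t = total + y
        c = (t - total) - y  # exactly what rounding discarded; 0.0 in exact arithmetic
        total = t
    return total
```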
PiperOrigin-RevId: 343545188
PiperOrigin-RevId: 343551796
This adds multipart support to Chain, and introduces Restructure and JointMap bijectors for operations on structured inputs. PiperOrigin-RevId: 343573225
…erarchy. The solution is not beautiful. In order to construct a valid CT subclass *class instance* to pass to the decorator, the subclass needs to have a `_type_spec` property (otherwise ABCMeta complains and class creation fails before the decorator can ever run). So we're forced to implant a trivial `_type_spec` definition, which is then overwritten by the decorator. This change also substantially refactors the internal logic of `auto_composite_tensor` to better separate the `AutoCompositeTensorTypeSpec` behaviors from the decorator, and uses the new behavior in `tfp.experimental.stats`. PiperOrigin-RevId: 343575343
…st `from_stats`. PiperOrigin-RevId: 343881821
When defining the inverse log-det-jacobian in terms of an automatic derivative, as we do here, its gradient requires *second* derivatives of the underlying inverse function. We therefore need to provide annotations for the second derivatives, using the inverse function theorem. It's a bit of a mess to do this in a way that works for both TF and JAX, but it seems to have worked. I'd welcome suggestions for making this neater. PiperOrigin-RevId: 343928308
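The inverse-function-theorem identities involved can be illustrated with a toy bijector, here `f(x) = exp(x)` with inverse `g(y) = log(y)` (chosen for illustration; this is not the bijector from the change). The ILDJ is `log|g'(y)|`, so its gradient needs `g''`, and both derivatives of the inverse can be written in terms of derivatives of the forward function:

```python
import math

# Inverse function theorem, with x = g(y):
#   g'(y)  =  1 / f'(x)
#   g''(y) = -f''(x) / f'(x) ** 3

def fprime(x):
    return math.exp(x)   # f'(x) for f(x) = exp(x)

def fsecond(x):
    return math.exp(x)   # f''(x)

def g_prime(x):
    """g'(f(x)) computed from forward derivatives only."""
    return 1.0 / fprime(x)

def g_second(x):
    """g''(f(x)) computed from forward derivatives only."""
    return -fsecond(x) / fprime(x) ** 3
```

For this choice of `f` the results can be checked against the closed forms `g'(y) = 1/y` and `g''(y) = -1/y**2`.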
Currently the HalfNormal distribution uses the default `log_prob(x) = log(prob(x))` implementation, which underflows for large x. Implementing log_prob directly improves stability and matches the implementation of the Normal distribution. I replaced prob because there's no clear reason to keep both: relying on the default `prob(x) = exp(log_prob(x))` implementation is consistent with the Normal distribution, and all tests pass. PiperOrigin-RevId: 343970631
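The underflow is easy to demonstrate in plain Python. This is a standalone sketch of the half-normal density, not TFP's code: `math.exp` underflows to `0.0` once the exponent drops below roughly -745, at which point `log(prob(x))` is unusable, while the direct log-density stays finite.

```python
import math

def half_normal_prob(x, scale):
    """Naive density sqrt(2/pi)/scale * exp(-x^2 / (2 scale^2)); underflows for large x."""
    return math.sqrt(2.0 / math.pi) / scale * math.exp(-x * x / (2.0 * scale * scale))

def half_normal_log_prob(x, scale):
    """Direct log-density; finite even where the naive path underflows to 0."""
    return 0.5 * math.log(2.0 / math.pi) - math.log(scale) - x * x / (2.0 * scale * scale)
```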
PiperOrigin-RevId: 344104708
…ad of unrolling huge TF graphs. Also delete dynamic shape tests, because it doesn't make sense to test the `from_shape` constructor against the shape not actually being known. Should replace them with dynamic shape tests of `from_example` later. PiperOrigin-RevId: 344124189
PiperOrigin-RevId: 344125780
Subclass `_parameter_properties` methods did take an eps argument in an earlier revision, but in the checked-in version it's just dtypes all the way down. PiperOrigin-RevId: 344274572
PiperOrigin-RevId: 344302481
PiperOrigin-RevId: 344308630
…t-ratio. This is what tfp.mcmc.HMC does. PiperOrigin-RevId: 344309370
… valid correlation matrices in more cases. PiperOrigin-RevId: 344321285
…he Hypothesis-level timeout from the command line. PiperOrigin-RevId: 344326945
PiperOrigin-RevId: 344927383
…formedDistribution`s. PiperOrigin-RevId: 344932582
This is due to travis-ci.org being deprecated. One immediate advantage of the switch is that GitHub Actions has more powerful machines, which aren't shared across projects in as obvious a way as Travis machines were. That said, they're not powerful enough to eschew test sharding altogether. PiperOrigin-RevId: 345051868
This fixes math formatting on tensorflow.org, and gives more descriptive variable names to some variables. PiperOrigin-RevId: 345053833
…SIS=1. Turns out the underlying Hypothesis was "randomizing" by initializing its PRNG from the system time, which is just asking for trouble whenever more than one processor is involved in a test run. PiperOrigin-RevId: 345065008
PiperOrigin-RevId: 345073056
PiperOrigin-RevId: 345102668
PiperOrigin-RevId: 345122450
PiperOrigin-RevId: 345140597
The correct reference should be:
[3] E. Schmidt, Über die Auflösung linearer Gleichungen mit unendlich vielen
Unbekannten, Rend. Circ. Mat. Palermo (1) 25:53-77 (1908).
PiperOrigin-RevId: 345317128
The real issue is illustrated by the test added to joint_distribution_test, where prior to this fix you couldn't actually compute gradients with respect to the samples of joint distributions due to some structure mismatches. The fix is likely overkill, but to the extent that the documentation for `make_sharded_log_prob_parts` doesn't single out lists/tuples as the only supported structure, this seems an acceptable thing to do. PiperOrigin-RevId: 345327288
PiperOrigin-RevId: 345363927
…fd.Sample`. This can substantially increase the precision of accumulation for log_prob evaluations over many independent elements. Note the fast-math caveat for XLA:CPU. PiperOrigin-RevId: 345453227
…iceSampler PiperOrigin-RevId: 345514055
When initializing, the chain bijector checks the inverse_event_ndims of each bijector. In certain cases, this was not defined and threw an exception (see unit test for one case that would trigger this). PiperOrigin-RevId: 345568297
…ility PiperOrigin-RevId: 345602861
…utions. PiperOrigin-RevId: 345679617
…of streaming MCMC. Supports: - Reductions - Streaming (by not tracing) when only the reductions are wanted - Tracing intermediate results of the reductions - Without having to know about `WithReductions` or `finalize` or any of the other machinery that makes this work under the hood. The `sample_fold` driver is now basically redundant, and should probably be removed (or at least rewritten as an alias of `run_kernel`). The original `sample_chain` driver in non-experimental tfp.mcmc is now also basically redundant, and should probably be rewritten as an alias for `run_kernel`, and possibly also deprecated. Some remaining limitations and possible additional future directions in code comments. PiperOrigin-RevId: 345740916
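The driver pattern described above can be sketched schematically in plain Python. This is only an illustration of the streaming idea, not `run_kernel`'s real signature: each reducer folds its running state forward on every step, and per-step history is materialized only when tracing is requested.

```python
def run_kernel(step_fn, state, num_steps, reducers, trace=False):
    """Schematic streaming driver.

    `reducers` maps a name to (init, update, finalize) callables. Running
    reduction states are updated in O(1) memory per step; the full chain is
    stored only if `trace` is True.
    """
    red_states = {name: init() for name, (init, _, _) in reducers.items()}
    history = [] if trace else None
    for _ in range(num_steps):
        state = step_fn(state)
        for name, (_, update, _fin) in reducers.items():
            red_states[name] = update(red_states[name], state)
        if trace:
            history.append(state)
    finals = {name: fin(red_states[name]) for name, (_, _, fin) in reducers.items()}
    return state, finals, history
```

With tracing off, this streams: only the current state and the reduction states are live, no matter how many steps are run.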
PiperOrigin-RevId: 346137918
…affine bijectors. Specifically, the bijector must either be scalar, or it must be a purely structural transformation (as embodied by the new property `_is_permutation`). PiperOrigin-RevId: 346150970
For numpy, the covariance is divided by N-1. PiperOrigin-RevId: 346203390
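The N-1 normalization (Bessel's correction) matches `numpy.cov`'s default of `ddof=1`. A minimal sketch of the convention, for illustration only:

```python
def sample_covariance(xs, ys):
    """Sample covariance with Bessel's correction: divide by N-1, not N."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
```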
PiperOrigin-RevId: 346227340
…nge to PHMC. PiperOrigin-RevId: 346322156
PiperOrigin-RevId: 346338660
We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google. ℹ️ Googlers: Go here for more info.
@googlebot I fixed it. All commits in this PR are cherry-picked from https://github.com/tensorflow/probability/tree/master (or written by me).
A Googler has manually verified that the CLAs look good. (Googler, please make sure the reason for overriding the CLA status is clearly documented in these comments.) ℹ️ Googlers: Go here for more info.
No description provided.