Support of ReverseDiff.jl as AD backend #428

Merged
merged 60 commits into master from reverse-diff on Apr 4, 2018

Conversation

2 participants
@xukai92
Collaborator

xukai92 commented Mar 7, 2018

  • Support ReverseDiff.jl as AD backend #428
  • Benchmark files for AISTATS
  • Improve typing
  • Refactor adaptation

TODOs

  • Allow SGLD and SGHMC to use ReverseDiff.jl based gradient function
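For readers unfamiliar with the library, the first bullet amounts to routing gradient computations through ReverseDiff.jl's reverse-mode API. The snippet below is plain ReverseDiff.jl usage on a made-up function f, not the wiring added inside Turing by this PR:

  using ReverseDiff

  # Made-up target; stands in for a model's log-joint as a function of the parameter vector.
  f(x) = -sum(abs2, x) / 2

  x = randn(5)

  # One-shot reverse-mode gradient.
  g = ReverseDiff.gradient(f, x)

  # For repeated gradient calls (e.g. inside HMC), record and compile the tape once, then reuse it.
  tape = ReverseDiff.compile(ReverseDiff.GradientTape(f, x))
  out  = similar(x)
  ReverseDiff.gradient!(out, tape, x)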

xukai92 added some commits Jan 18, 2018

@@ -0,0 +1,14 @@
using Turing: VarEstimator, add_sample!, get_var

@xukai92

xukai92 Apr 2, 2018

Collaborator

@yebai Unit test for var estimator is here.
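For context on what the imported VarEstimator is expected to do: Stan-style diagonal pre-conditioning needs a running estimate of the posterior variance, which is typically kept with a Welford accumulator. The sketch below only illustrates that shape; the names follow the import above, but the fields and implementation are assumptions, not the PR's code:

  # Illustrative Welford-style running variance estimator (not the PR's implementation).
  mutable struct VarEstimator{T<:Real}
      n::Int          # number of samples seen
      μ::Vector{T}    # running mean
      M2::Vector{T}   # running sum of squared deviations
  end

  VarEstimator(d::Int) = VarEstimator(0, zeros(d), zeros(d))

  function add_sample!(ve::VarEstimator, s::AbstractVector)
      ve.n += 1
      δ = s .- ve.μ
      ve.μ .+= δ ./ ve.n
      ve.M2 .+= δ .* (s .- ve.μ)
      return ve
  end

  # Sample variance; requires at least two samples.
  get_var(ve::VarEstimator) = ve.M2 ./ (ve.n - 1)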

@@ -10,7 +11,7 @@ end
 sample_momentum(vi::VarInfo, spl::Sampler) = begin
   dprintln(2, "sampling momentum...")
-  randn(length(getranges(vi, spl))) .* spl.info[:wum][:stds]
+  randn(length(getranges(vi, spl))) ./ spl.info[:wum][:stds]

@xukai92

xukai92 Apr 2, 2018

Collaborator

@yebai how to use pre-cond (1)

@yebai

yebai Apr 3, 2018

Collaborator

this is correct
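To make the pre-conditioning convention explicit: following Stan, the inverse mass matrix is set to the estimated posterior variance, so M⁻¹ = Diagonal(stds.^2), and momentum is drawn as p ~ N(0, M), i.e. standard normals scaled by 1 ./ stds. That is why the ./ version above is the correct one. A minimal standalone sketch of the same idea (hypothetical free function, not the PR's sample_momentum):

  # With M⁻¹ = Diagonal(stds.^2), drawing p ~ N(0, M) componentwise is z ./ stds with z ~ N(0, I).
  sample_momentum_diag(stds::Vector{Float64}) = randn(length(stds)) ./ stds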

@@ -52,35 +79,58 @@ find_H(p::Vector, model::Function, vi::VarInfo, spl::Sampler) = begin
   # This can be a result of link/invlink (where expand! is used)
   if getlogp(vi) == 0 vi = runmodel(model, vi, spl) end
-  p_orig = p ./ spl.info[:wum][:stds]
+  p_orig = p .* spl.info[:wum][:stds]

@xukai92

xukai92 Apr 2, 2018

Collaborator

@yebai how to use pre-cond (3)

@yebai

yebai Apr 3, 2018

Collaborator

this is correct; perhaps we can change p_orig to p_prime

  p_prime = p .* spl.info[:wum][:stds]
  H = dot(p_prime, p_prime) / 2 + realpart(-getlogp(vi))
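Spelling out why the .* form is correct for the kinetic-energy term: with the same convention M⁻¹ = Diagonal(stds.^2), the Hamiltonian is H(θ, p) = pᵀ M⁻¹ p / 2 - log p(θ), and pᵀ M⁻¹ p = dot(p .* stds, p .* stds), which is exactly what the suggested p_prime lines compute. A minimal standalone sketch under that assumption (hypothetical names; lp stands in for getlogp(vi)):

  using LinearAlgebra: dot

  # H(θ, p) = pᵀ M⁻¹ p / 2 - log p(θ), with diagonal M⁻¹ = Diagonal(stds.^2).
  function hamiltonian_diag(p::Vector{Float64}, stds::Vector{Float64}, lp::Float64)
      p_prime = p .* stds          # componentwise M^(-1/2) * p
      return dot(p_prime, p_prime) / 2 - lp
  end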
@xukai92

Collaborator

xukai92 commented Apr 2, 2018

@yebai I tagged you in a few places in the code. For adaptation in general, the code is mostly in adapt.jl, with the corresponding reference source code from Stan included in comments. hmc_core.jl also contains code showing how to use the pre-conditioning matrix; I left a link to the referenced Stan code at the top of that file. Please take a look and let me know if anything is unclear.
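For readers who do not have the Stan sources at hand: the step-size part of the adaptation discussed here is Nesterov-style dual averaging as described in the NUTS paper (Hoffman and Gelman, 2014). The sketch below is a generic rendition of that update with the paper's constants (gamma = 0.05, t0 = 10, kappa = 0.75); the names DAState and adapt_step_size! are illustrative, and this is not the adapt.jl implementation:

  # Generic dual-averaging step-size adaptation (Hoffman & Gelman, 2014);
  # an illustration of the scheme referenced above, not adapt.jl itself.
  mutable struct DAState
      mu::Float64          # shrinkage target, log(10 * initial step size)
      m::Int               # adaptation iteration counter
      Hbar::Float64        # running average of (delta - alpha)
      logeps::Float64      # current log step size
      logepsbar::Float64   # averaged log step size, used after warm-up
  end

  DAState(eps0::Float64) = DAState(log(10 * eps0), 0, 0.0, log(eps0), 0.0)

  function adapt_step_size!(da::DAState, alpha::Float64;
                            delta = 0.65, gamma = 0.05, t0 = 10, kappa = 0.75)
      da.m += 1
      da.Hbar   = (1 - 1 / (da.m + t0)) * da.Hbar + (delta - alpha) / (da.m + t0)
      da.logeps = da.mu - sqrt(da.m) / gamma * da.Hbar
      w         = da.m^(-kappa)
      da.logepsbar = w * da.logeps + (1 - w) * da.logepsbar
      return exp(da.logeps)    # step size to use for the next iteration
  end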

@xukai92

Collaborator

xukai92 commented Apr 3, 2018

@yebai Anything left to do in this PR?

@yebai yebai merged commit 43017ad into master Apr 4, 2018

0 of 2 checks passed

  • continuous-integration/appveyor/branch: Waiting for AppVeyor build to complete
  • continuous-integration/travis-ci/push: The Travis CI build is in progress

@yebai yebai deleted the reverse-diff branch Aug 18, 2018

yebai added a commit that referenced this pull request Sep 18, 2018

Support of ReverseDiff.jl as AD backend (#428)
* Fix dep log in lad

* Dont send opt res

* Fix VarInfo.show bug

* Fix auto tune

* Change * to .* in leapfrog

* temp fix type

* Disable @suppress_err temporarily

* Fix a dep

* Workable ReverseDiff v0.1 done

* Add RevDiff to REQUIRE

* Fix bug in R-AD

* Fix some bugs

* Fix bugs

* Update test

* ReverseDiff.jl mutable bug fixed

* Any to Real

* update benchmark

* Resolve mem alloc for simplex dist

* Fix bug and improve mem alloc

* Improve implementation of transformations

* Don't include compile time in benchmark

* Resolve slowness caused by use of vi.logp

* Update benchmark files

* Add line to load pickle

* Bugfix with reject

* Using ReverseDiff.jl and unsafe model as default

* Fix bug in test file

* Rename vi.rs to vi.rvs

* Add Naive Bayes model in Turing

* Add NB to travis

* DA works

* Tune init

* Better init

* NB MNIST Stan added

* Improve ad assignment

* Improve ad assignment

* Add Stan SV model

* Improve transform typing

* Finish HMM model

* High dim gauss done

* Benchmark v2.0 done

* Modulize var estimator and fix transform.jl

* Run with ForwardDiff

* Enable Stan for LDA bench

* Fix a bug in adapt

* Improve some code

* Fix bug in NUTS MH step (#324)

* Add interface for optionally enabling adaptation.

* Do not adapt step size when numerical error is caught.

* Fix initial epsilon_bar.

* Fix missing t_valid.

* Drop incorrectly adapted step size when necessary  (#324)

* Edit warning message.

* Small tweaks.

* reset_da ==> restart_da

* address suggested naming

* Sampler type for WarmUpManager.paras and notation tweaks.

* Bugfix and adapt_step_size ==> adapt_step_size!