tap --> 0.8 #2027

Closed

JaimeRZP wants to merge 6 commits into torfjelde/allow-abstractsampler-draft from TAP_08

Conversation

@JaimeRZP (Member) commented Jul 3, 2023

No description provided.

@@ -27,7 +27,7 @@ function initialize_nuts(model::Turing.Model)
     proposal = AdvancedHMC.NUTS{AdvancedHMC.MultinomialTS,AdvancedHMC.GeneralisedNoUTurn}(integrator)
     adaptor = AdvancedHMC.StanHMCAdaptor(
         AdvancedHMC.MassMatrixAdaptor(metric),
-        AdvancedHMC.StepSizeAdaptor(0.65, integrator)
+        AdvancedHMC.StepSizeAdaptor(0.8, integrator)

@JaimeRZP this shouldn't matter very much; if tests fail for 0.65, try to increase the number of samples.
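
For context, a minimal self-contained sketch of where that number enters the sampler (assuming the pre-0.5 AdvancedHMC API used in the diff above, and a toy standard-normal target rather than the PR's Turing model; all names below are illustrative):

using AdvancedHMC, ForwardDiff, LogDensityProblems

# Toy target: a D-dimensional standard normal (illustrative only, not the PR's test model).
struct LogTargetDensity
    dim::Int
end
LogDensityProblems.logdensity(p::LogTargetDensity, θ) = -sum(abs2, θ) / 2
LogDensityProblems.dimension(p::LogTargetDensity) = p.dim
LogDensityProblems.capabilities(::Type{LogTargetDensity}) =
    LogDensityProblems.LogDensityOrder{0}()

D = 10
ℓπ = LogTargetDensity(D)
initial_θ = rand(D)

metric = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
integrator = Leapfrog(find_good_stepsize(hamiltonian, initial_θ))
proposal = NUTS{MultinomialTS,GeneralisedNoUTurn}(integrator)

# The first argument of StepSizeAdaptor is the target acceptance rate;
# this PR bumps it from 0.65 to 0.8 in initialize_nuts.
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

samples, stats = sample(hamiltonian, proposal, initial_θ, 1_000, adaptor, 500)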

codecov bot commented Jul 3, 2023

Codecov Report

Patch and project coverage have no change.

Comparison is base (22cdfeb) 0.00% compared to head (fcfbbfb) 0.00%.

Additional details and impacted files
@@                          Coverage Diff                          @@
##           torfjelde/allow-abstractsampler-draft   #2027   +/-   ##
=====================================================================
  Coverage                                   0.00%   0.00%           
=====================================================================
  Files                                         22      22           
  Lines                                       1462    1464    +2     
=====================================================================
- Misses                                      1462    1464    +2     
Impacted Files                           Coverage Δ
src/contrib/inference/abstractmcmc.jl    0.00% <0.00%> (ø)

☔ View full report in Codecov by Sentry.

@@ -58,6 +59,11 @@ function AbstractMCMC.step(
     # Link the varinfo.
     f = setvarinfo(f, DynamicPPL.link!!(getvarinfo(f), model))

+    # Make init_params
+    if init_params === nothing
+        init_params = 4 .* rand(rng, LogDensityProblems.dimension(f)) .- 2

Is this supposed to replicate the default initialization that already exists in DynamicPPL? The motivation is a bit unclear to me, since:

1) initial parameters will already be set using this approach by the code in DynamicPPL,
2) IIRC init_params is supposed to be given in the original, untransformed space of the variables, so this initialization might produce values outside of the support (in DynamicPPL this initialization is performed in the transformed space, where these values are always valid!), and
3) init_params is not used below.
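
A small hypothetical illustration of point 2) above (not code from the PR): uniform draws in [-2, 2] are always valid when interpreted in the transformed (unconstrained) space, but can fall outside the support when interpreted in the original space of a constrained variable.

using Distributions

dist = InverseGamma(2, 3)                 # support is (0, ∞)
raw = 4 .* rand(3) .- 2                   # uniform draws in [-2, 2]

# Interpreted as untransformed values: negative draws violate the support.
all(x -> insupport(dist, x), raw)         # frequently false

# Interpreted in the log-transformed (linked) space, as DynamicPPL uses for
# positive variables, and mapped back with exp: always within the support.
all(x -> insupport(dist, x), exp.(raw))   # always true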

@JaimeRZP JaimeRZP closed this Jul 4, 2023
@JaimeRZP JaimeRZP deleted the TAP_08 branch July 4, 2023 13:28