When running calculate() with an mcmc.list object, all of the samples are pushed through the model simultaneously. This is usually the most efficient approach, but it can run into memory constraints when the number of samples is large and the model already contains very large data objects.
If there were a max_samples argument (defaulting to Inf), the user could just process the samples in batches instead.
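The batching idea could be sketched roughly as below. This is a minimal illustration, not greta's implementation: the helper name `calculate_batched` is hypothetical, and the exact signature of `calculate()` and the right way to recombine batch results would depend on greta's internals.

```r
library(coda)

# Hypothetical sketch of batched processing of an mcmc.list.
# `target` is a greta array and `draws` an mcmc.list, as in calculate().
calculate_batched <- function(target, draws, max_samples = Inf) {
  # Flatten all chains into a single samples-by-parameters matrix.
  mat <- as.matrix(draws)
  n <- nrow(mat)
  if (!is.finite(max_samples)) max_samples <- n

  # Split the row indices into batches of at most max_samples rows.
  starts <- seq(1, n, by = max_samples)
  results <- lapply(starts, function(s) {
    idx <- s:min(s + max_samples - 1, n)
    batch <- coda::as.mcmc.list(coda::as.mcmc(mat[idx, , drop = FALSE]))
    # Assumed call pattern; pushes only this batch through the model.
    calculate(target, values = batch)
  })

  # Recombining the per-batch results is left open here, since it
  # depends on the structure calculate() returns.
  results
}
```

With `max_samples = Inf` this degenerates to the current single-pass behaviour, so existing code would be unaffected.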
We could also try to catch the out-of-memory error and return a nicer message to the user.
It doesn't look like catching an out-of-memory error will be reliable; at least on my Mac it crashes RStudio (segfault?), and predicting the memory used by an operation doesn't seem feasible. Allowing users to tweak the batch size, with a sensible default value, seems like the best option.