
NB support #89

Open
wangxinge opened this issue Aug 10, 2020 · 1 comment
Labels: enhancement (New feature or request)

Comments

@wangxinge

Hi there,
Can I use glm.nb for the regression?
Thank you!

lindeloev added this to the 0.4 Multiple predictors milestone Aug 10, 2020
lindeloev added the enhancement label Aug 10, 2020
@lindeloev (Owner) commented Aug 10, 2020

Negative binomial regression is not natively supported yet, but I will make adding it a priority. Specifically:

  • Support for mcp(..., family = negbinomial()) with appropriate default priors.
  • Allow changepoint regression on the shape parameter, e.g., model = list(y ~ 1 + x + shape(1 + x)) fitted with mcp(model, data = df); see the sketch after this list.
  • Add support for which_y = "shape" in plot(), fitted(), etc.
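
For reference, here is a sketch of what that planned interface could look like. None of negbinomial(), shape(), or which_y = "shape" exist in mcp yet, so the code below is tentative and will not run against the current release:

```r
library(mcp)

# Tentative future syntax (not yet implemented in mcp): a changepoint model
# where both the mean and the negative binomial shape parameter can change.
model = list(
  y ~ 1 + x,               # segment 1: intercept and slope on the mean
  ~ 1 + x + shape(1 + x)   # segment 2: new mean structure, varying shape
)

fit = mcp(model, data = df, family = negbinomial())
plot(fit, which_y = "shape")  # planned: plot the shape parameter over x
```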

Until then, you should be able to "hack" it by taking the JAGS code in fit$jags_code from a regular binomial model, replacing dbin(y_[i_], N[i_]) with dnegbin(y_[i_], shape), and adding an appropriate prior for shape, e.g., shape ~ dunif(0, 50). Then feed that string back into mcp via mcp(model, data, jags_code = your_updated_code). Unfortunately, this won't return posterior samples for shape.
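
A minimal sketch of that workaround. The substituted strings below are taken from the description above and may not match the generated code exactly, so inspect fit$jags_code and adjust the substitutions by hand; sample = FALSE is used here only to skip MCMC while obtaining the code:

```r
library(mcp)

# Build a regular binomial model without sampling, just to obtain its JAGS code
model = list(y | trials(N) ~ 1 + x)
fit = mcp(model, data = df, family = binomial(), sample = FALSE)
cat(fit$jags_code)  # inspect the generated likelihood and priors

# Swap the binomial likelihood for a negative binomial and add a shape prior.
# These substitutions are illustrative; match them to your actual jags_code.
code = fit$jags_code
code = gsub("dbin(y_[i_], N[i_])", "dnegbin(y_[i_], shape)", code, fixed = TRUE)
code = sub("model {", "model {\n  shape ~ dunif(0, 50)", code, fixed = TRUE)

# Re-run mcp with the hand-edited JAGS code
fit_nb = mcp(model, data = df, jags_code = code)
```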
