Penalizing parameters for being too small? #168

Open
DataNeel opened this Issue Feb 9, 2018 · 2 comments

@DataNeel

DataNeel commented Feb 9, 2018

I realize that it is typical to penalize model coefficients/parameters for being too large. When training the CLV models implemented by this library, would it ever make sense to penalize coefficients for being too small in addition to penalizing them for being too large?

In my experimentation, I've come across instances where the fitted distribution parameters end up being very small, which can produce implausible values for the probability that a customer is alive. It seems like it would be valuable to keep these parameters within a sensible range, rather than only preventing them from being too large; a toy sketch of what I mean is below. Perhaps somebody who knows more can tell me why that would be ill-advised.
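
For concreteness, here's the kind of penalty I have in mind (purely illustrative: an exponential likelihood stands in for the actual CLV models, and the penalty weight and target are made-up values). A quadratic penalty on the log of a parameter punishes values that are too small just as hard as values that are too large:

```python
import numpy as np
from scipy.optimize import minimize

# toy data: inter-event times (illustrative, not real CLV data)
data = np.array([1.2, 0.8, 2.5, 1.9, 3.1])

def penalized_nll(log_lam, strength=1.0, target=0.0):
    # Exponential negative log-likelihood plus a quadratic penalty on
    # log(lambda). Centering the penalty at `target` pulls the fitted
    # rate toward exp(target); deviations in either direction (too
    # small or too large) are penalized symmetrically.
    lam = np.exp(log_lam[0])
    nll = -np.sum(np.log(lam) - lam * data)
    penalty = strength * (log_lam[0] - target) ** 2
    return nll + penalty

res = minimize(penalized_nll, x0=np.array([0.0]))
print(np.exp(res.x[0]))  # fitted rate, pulled toward exp(target) = 1.0
```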

@rhydomako

Contributor

rhydomako commented Feb 11, 2018

Hi @DataNeel,

It sounds like you have some prior information about valid parameter ranges for your fits. This information is super useful for making sure your predictions make sense, but it can be a bit tricky to incorporate into the MLE optimization framework that lifetimes uses.
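
If you do want to stay within MLE, one blunt workaround is to run the optimization yourself with box constraints on the parameters. A minimal sketch, with a toy exponential likelihood standing in for the model's actual negative log-likelihood (this is not lifetimes' internal fitting code):

```python
import numpy as np
from scipy.optimize import minimize

# toy inter-event times standing in for real transaction data
data = np.array([1.2, 0.8, 2.5, 1.9, 3.1])

def nll(params):
    # exponential negative log-likelihood in the rate parameter
    lam = params[0]
    return -np.sum(np.log(lam) - lam * data)

# L-BFGS-B honors box constraints, so the rate cannot collapse below 0.1
res = minimize(nll, x0=[1.0], bounds=[(0.1, 10.0)], method="L-BFGS-B")
print(res.x)
```

Hard bounds are crude, though: the optimum often ends up pinned to a boundary, which is usually a sign the model or the data needs another look.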

I'd suggest taking a look at some MCMC approaches. For example, https://github.com/datascienceinc/pydata-seattle-2017/blob/master/lifetime-value/pareto-nbd.ipynb implements the Pareto-NBD model using pymc3. The advantage there is that you have the flexibility to specify prior distributions for your model parameters that de-emphasize small values.
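
As a minimal sketch of the prior idea (a toy exponential-rate model with made-up data, not the full Pareto-NBD likelihood from that notebook): a Gamma prior with shape alpha > 1 has zero density at the origin, so the posterior gets pulled away from very small parameter values.

```python
import numpy as np
import pymc3 as pm

# hypothetical inter-purchase times in weeks
times = np.array([1.2, 0.8, 2.5, 1.9, 3.1, 0.7, 1.4])

with pm.Model():
    # Gamma(alpha=2, beta=1) has zero density at 0, discouraging the
    # sampler from settling on a vanishingly small rate.
    lam = pm.Gamma("lam", alpha=2.0, beta=1.0)
    pm.Exponential("obs", lam=lam, observed=times)
    trace = pm.sample(1000, tune=1000)

print(trace["lam"].mean())  # posterior mean of the rate
```

In the full Pareto-NBD model you'd put priors like this on each of r, alpha, s, and beta.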

@DataNeel

DataNeel commented Feb 12, 2018

Thanks! I'll take a closer look at that link.

I'd still love to hear any related thoughts or experiences, if people have them.
