Support the MPEC formulation of the problem #10

Open
jeffgortmaker opened this issue May 2, 2018 · 4 comments
Labels
enhancement New feature or request

Comments

@jeffgortmaker
Owner

This depends on the implementation of Hessian computation from #8.

@jiafengkevinchen

Might it be sensible to migrate some of the code to autodiff packages like https://github.com/google/jax? It seems to have most of NumPy's functionality, but it still looks like an experimental package.

@jeffgortmaker
Owner Author

I've been eyeing autograd for a while now. If it works out, automatic differentiation should make the code cleaner and easier to extend. Do you know how JAX would differ for this use case? It looks like XLA imposes some constraints but might also speed things up.

A good starting point would be replacing some of the simpler derivatives (e.g., Market.compute_xi_by_theta_jacobian) with autograd or JAX functions and comparing performance. I'd be okay with a modest performance hit (especially with smaller datasets where autograd/JAX overhead could be substantial), but extra numerical instability or less transparent error handling (e.g., opaque problems with matrix inversion) could be a deal-breaker.
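
For concreteness, here's a minimal sketch of what that comparison might look like. The `compute_xi` function below is a hypothetical stand-in, not pyblp's actual implementation, and stands in for the mapping behind `Market.compute_xi_by_theta_jacobian`:

```python
# Hypothetical sketch (not pyblp's actual API): compare a JAX-computed Jacobian
# against a crude finite-difference version of the same mapping.
import numpy as np
import jax
import jax.numpy as jnp

def compute_xi(theta, x):
    # Stand-in for the mapping whose Jacobian with respect to theta we want.
    return jnp.tanh(x @ theta)

theta = jnp.array([0.5, -1.0, 2.0])
x = jnp.asarray(np.random.default_rng(0).normal(size=(100, 3)))

# Forward-mode is typically cheaper when the parameter vector is short.
jacobian_ad = jax.jacfwd(compute_xi)(theta, x)  # shape (100, 3)

# Finite-difference check of the same Jacobian.
eps = 1e-6
jacobian_fd = np.column_stack([
    (compute_xi(theta.at[k].add(eps), x) - compute_xi(theta, x)) / eps
    for k in range(theta.size)
])
print(np.abs(jacobian_ad - jacobian_fd).max())  # should be small
```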

This is relatively low on my to-do list. I think practitioners will benefit much more from other features than from MPEC. But I'm happy to talk more if you want to take a shot at replacing some functions!

@jiafengkevinchen

jiafengkevinchen commented Jul 23, 2019

I've not used JAX/autograd (I think JAX is a juiced-up version of autograd, which seems to be a bit sketchy as a repo), but I've used things like PyTorch, which provide fairly painless autograd functionality. My prior is that autograd probably works on a restricted set of NumPy functions: most functions probably have gradients implemented, while things like index assignment or other "discrete"-ish operations might be a pain. Given that I'm going to see you in a week, happy to talk in person as well; very interested in contributing to this library!
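
As a minimal illustration of the index-assignment point (assuming JAX's functional `.at[...].set(...)` update, which is a documented part of jax.numpy):

```python
# JAX arrays are immutable, so NumPy-style in-place assignment has to be
# rewritten as a functional update that returns a new array.
import numpy as np
import jax.numpy as jnp

x_np = np.zeros(5)
x_np[2] = 1.0                  # fine in NumPy

x_jax = jnp.zeros(5)
# x_jax[2] = 1.0               # raises an error: JAX arrays don't support item assignment
x_jax = x_jax.at[2].set(1.0)   # functional update; returns a new array
```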

@jeffgortmaker
Owner Author

Gotcha, sounds good -- talk soon
