
Better optimisation method for Dixon-Coles model #1

Closed
Torvaney opened this issue May 27, 2018 · 5 comments

@Torvaney (Owner) commented May 27, 2018

Use an optimisation routine that enforces the constraints that make the Dixon-Coles model identifiable.

Ideally this would allow arbitrary constraints to be added to additional predictor variables specified in dixoncoles_ext.
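For context, the Dixon-Coles attack/defence parameters are only identifiable up to a constant shift, usually resolved by constraining the attack ratings (e.g. to average zero on the log scale). A minimal sketch of enforcing such a constraint with a quadratic penalty, using a stand-in objective rather than the real likelihood:

```r
# Stand-in for the Dixon-Coles negative log-likelihood (the real one is
# omitted here); `attack` plays the role of the log attack ratings.
negloglik <- function(attack) sum((attack - c(0.3, -0.1, 0.2))^2)

# Quadratic penalty pushing mean(attack) towards zero, the usual
# identifiability constraint.
penalised <- function(attack, weight = 100) {
  negloglik(attack) + weight * mean(attack)^2
}

fit <- optim(c(0, 0, 0), penalised)
mean(fit$par)  # close to zero once the penalty is applied
```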

Torvaney self-assigned this May 27, 2018

@Torvaney (Owner) commented Jun 17, 2018

I think optim is actually kind of okay (aside from warnings resulting from invalid rho estimates).

Currently I'm thinking a method kind of like this might work well:

  • Specify constraints as a list of formulae. Something like c(x) ~ equal_to(0).
    • Could specify the penalty/barrier method here. Like gte_zero('log_barrier') or something.
  • Convert formulae to a set of penalty functions.
  • Create a new objective function as sum of original and the penalty functions.

I guess there could be some difficulties/annoyances with the penalty weights. These would probably need to be customisable, either through some control-type function argument (or varargs with !!!), or in the constraint/formula specification (like 0.1 * c(x) ~ equal_to(0)).

Sequential solutions with increasing penalty weights should also be considered. In this case each constraint could return a function that takes a weight and returns a penalty function (or just takes weight as an argument - maybe this is more idiomatic in R?).
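A rough sketch of the formula-to-penalty conversion described above; equal_to and make_penalty are illustrative names, not an existing API, and the left-hand side is written as sum(x) for concreteness:

```r
# A constraint "generator": equal_to(target) returns the penalty applied
# to the evaluated left-hand side of the constraint formula.
equal_to <- function(target) {
  function(value) (value - target)^2
}

# Turn a two-sided formula like sum(x) ~ equal_to(0) into a penalty
# function of a named parameter vector.
make_penalty <- function(constraint, weight = 1) {
  lhs <- constraint[[2]]        # unevaluated expression, e.g. sum(x)
  pen <- eval(constraint[[3]])  # penalty generator, e.g. equal_to(0)
  function(params) weight * pen(eval(lhs, as.list(params)))
}

penalty <- make_penalty(sum(x) ~ equal_to(0), weight = 10)
penalty(c(x = 0.5))  # 10 * (0.5 - 0)^2 = 2.5
```

The new objective would then be the original negative log-likelihood plus the sum of these penalties over all constraints.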

@Torvaney (Owner) commented Jun 17, 2018

Another thought: adding constraints to team parameters could be annoying.

One potential solution would be to look for parameters named by off___* or def___*. This would at least work for the default method, even if it is a little inelegant.
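A sketch of that naming-convention trick (parameter names and values here are made up):

```r
# Made-up parameter vector following the off___*/def___* convention.
params <- c(off___Arsenal = 0.2, off___Chelsea = -0.1,
            def___Arsenal = 0.1, hfa = 0.3)

# Target only the attack parameters by name prefix, and penalise their
# mean away from zero.
team_penalty <- function(p, weight = 10) {
  off <- p[grepl("^off___", names(p))]
  weight * mean(off)^2
}

team_penalty(params)  # 10 * 0.05^2 = 0.025
```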

@Torvaney (Owner) commented Aug 12, 2018

The easiest way to make model fitting faster is probably to compute the gradient of the objective function, rather than relying on gradient-free methods...
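For illustration: optim approximates gradients numerically unless one is supplied, and passing an analytical gradient via the gr argument (shown here on a toy quadratic, not the Dixon-Coles likelihood) usually cuts the number of objective evaluations substantially:

```r
# Toy objective and its exact gradient; the real gain would come from
# differentiating the Dixon-Coles log-likelihood analytically.
f  <- function(p) sum((p - c(1, 2))^2)
gr <- function(p) 2 * (p - c(1, 2))

fit <- optim(c(0, 0), f, gr = gr, method = "BFGS")
fit$par  # converges to (1, 2)
```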

@Torvaney (Owner) commented Sep 14, 2018

Potentially a dumb idea, but given it's all just matrix operations, could we try using the rstudio/keras API?

@Torvaney (Owner) commented Nov 18, 2018

Closing for now since the kind of optimisation improvements I want would be their own project...

Torvaney closed this Nov 18, 2018
