TST/ENH StandardizeTransform, reparameterize TestProbitCG #1699
Referenced this pull request on May 27, 2014.
Strange: Python 2.6 on Travis CI still needs 59 function calls (fcalls) for convergence when we are supposed to be at the optimum already (with a different parameterization, i.e. different scaling of the gradient/score).
The Transformation class still needs a bit more thought, mainly about the API. For now I wrote it quickly to fix the test problem.
In a similar form, I wrote a reparameterization for linear constraints based on the Stata manual, which is essentially the same as Kerby's in GEE, AFAICS. Each version has different method names and a slightly different structure.
I haven't looked at the actual use and only understand the intention, but should it follow the StatefulTransform protocol?
It's a bit similar, in this case to the Standardize transform https://github.com/pydata/patsy/blob/master/patsy/state.py#L123
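For concreteness, patsy's stateful-transform protocol splits estimation of the transform's state from its application: the builder feeds data in chunks to `memorize_chunk`, calls `memorize_finish` once, and then calls `transform`. A hypothetical minimal re-implementation of a standardize-like transform following that protocol (not patsy's actual `Standardize` class) might look like:

```python
import numpy as np

class MyStandardize:
    """Sketch of the patsy stateful-transform protocol: accumulate
    sufficient statistics chunk by chunk, finalize once, then transform."""

    def __init__(self):
        self._sums = 0.0
        self._sq_sums = 0.0
        self._n = 0

    def memorize_chunk(self, x):
        # accumulate column sums and sums of squares over (possibly many) chunks
        x = np.asarray(x, dtype=float)
        self._sums += x.sum(axis=0)
        self._sq_sums += (x ** 2).sum(axis=0)
        self._n += x.shape[0]

    def memorize_finish(self):
        # finalize mean and (population) standard deviation
        self.mean_ = self._sums / self._n
        var = self._sq_sums / self._n - self.mean_ ** 2
        self.std_ = np.sqrt(var)

    def transform(self, x):
        return (np.asarray(x, dtype=float) - self.mean_) / self.std_

x = np.arange(10.0).reshape(5, 2)
st = MyStandardize()
st.memorize_chunk(x[:3])   # state can be built incrementally
st.memorize_chunk(x[3:])
st.memorize_finish()
z = st.transform(x)        # standardized columns: mean 0, std 1
```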
However, it's also pretty different:
The parameter-transformation part is similar to the stationary AR transform in that it needs to take all parameters into account. It is also similar to the other parameter transformation that we use in fit (and turn on and off during estimation).
The use case of Kerby's GEE code, and of Stata, is to transform the model (both data and parameters) to impose linear constraints of the form
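Assuming constraints of the standard form `R @ beta = q`, such a reparameterization can be sketched with numpy as follows; the function and variable names here are mine for illustration, not the actual statsmodels or GEE API. The idea is to write `beta = b0 + T @ z` with `R @ b0 = q` and `R @ T = 0`, so the constraint holds by construction and the model is refit with the reduced design `exog @ T` plus a fixed offset:

```python
import numpy as np

def transform_constraint(exog, R, q):
    """Sketch: reparameterize a linear model so that R @ beta = q holds
    by construction (hypothetical names, not the statsmodels API)."""
    R = np.atleast_2d(np.asarray(R, dtype=float))
    # particular solution b0 with R @ b0 = q
    b0 = np.linalg.pinv(R) @ np.atleast_1d(q)
    # columns of T span the null space of R, so R @ (b0 + T @ z) = q for any z
    u, s, vt = np.linalg.svd(R)
    rank = int(np.sum(s > 1e-12))
    T = vt[rank:].T
    exog_new = exog @ T      # reduced design matrix for the free parameters z
    offset = exog @ b0       # fixed contribution from the particular solution
    return exog_new, offset, T, b0
```

Fitting the reduced model with `offset` and mapping `z` back through `b0 + T @ z` then gives estimates in the original parameterization that satisfy the constraint exactly.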
A Principal Component Regression that returns the original parameterization would be similar:
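A rough sketch of that idea, assuming an already-centered design matrix and hypothetical names: fit on the leading principal-component scores, then map the component coefficients back to the original exog space through the loadings.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Sketch of principal component regression that reports coefficients
    in the original parameterization (assumes X is already centered)."""
    # principal directions from the SVD of the design matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Vk = Vt[:k].T                          # loadings of the first k components
    Z = X @ Vk                             # component scores (transformed exog)
    b_pc, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Vk @ b_pc                       # map back to the original exog space
```

With `k` equal to the full column rank this reproduces plain least squares; with smaller `k` it is the usual regularized PCR estimate, still reported against the original columns.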
The use case in this issue is similar to our parameter transformation during fit to improve optimization; only in this case I do it from outside the model, because I need a new model with transformed exog.
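A minimal sketch of that kind of outside-the-model rescaling, with plain least squares standing in for the actual estimator and fit: scale the exog columns to unit standard deviation, fit the new model in the well-conditioned parameterization, and divide by the scale to get back the original parameters.

```python
import numpy as np

def fit_scaled(exog, endog):
    """Sketch: rescale exog columns before fitting, then map the estimate
    back to the original parameterization (lstsq as a stand-in for the
    actual model's fit)."""
    std = exog.std(axis=0)
    std = np.where(std == 0, 1.0, std)   # leave constant columns (intercept) unscaled
    b_scaled, *_ = np.linalg.lstsq(exog / std, endog, rcond=None)
    # exact reparameterization: beta_orig = beta_scaled / std
    return b_scaled / std
```

Because column scaling is an exact reparameterization, the back-transformed estimate agrees with fitting the original exog directly; only the optimizer sees better-scaled parameters and gradients.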
I don't know, but I'm a bit doubtful whether we can integrate this cleanly into the