
Updated examples

commit 51d5bb5385dd90edcbc71f0b23764efa706cfb9d (parent 6a66123)
Authored by Ian Langmore (langmore), committed by josef-pkt
Showing with 53 additions and 46 deletions.
  1. +0 −43 README.txt
  2. +40 −0 README_l1.txt
  3. +13 −3 l1_demo/short_demo.py
README.txt
@@ -1,46 +1,3 @@
-What the l1 addition is
-=======================
-A slight modification that allows l1 regularized LikelihoodModel.
-
-Regularization is handled by a fit_regularized method.
-
-Main Files
-==========
-
-l1_demo/demo.py
- $ python demo.py --get_l1_slsqp_results logit
- does a quick demo of the regularization using logistic regression.
-
-l1_demo/sklearn_compare.py
- $ python sklearn_compare.py
- Plots a comparison of regularization paths. Modify the source to use
- different datasets.
-
-statsmodels/base/l1_cvxopt.py
- fit_l1_cvxopt_cp()
- Fit likelihood model using l1 regularization. Use the CVXOPT package.
- Lots of small functions supporting fit_l1_cvxopt_cp
-
-statsmodels/base/l1_slsqp.py
- fit_l1_slsqp()
- Fit likelihood model using l1 regularization. Use scipy.optimize
- Lots of small functions supporting fit_l1_slsqp
-
-statsmodels/base/l1_solvers_common.py
- Common methods used by l1 solvers
-
-statsmodels/base/model.py
- Likelihoodmodel.fit()
- 3 lines modified to allow for importing and calling of l1 fitting functions
-
-statsmodels/discrete/discrete_model.py
- L1MultinomialResults class
- Child of MultinomialResults
- MultinomialModel.fit()
- 3 lines re-directing l1 fit results to the L1MultinomialResults class
-
-
-
What Statsmodels is
===================
What it is
README_l1.txt
@@ -0,0 +1,40 @@
+What the l1 addition is
+=======================
+A slight modification that allows l1-regularized fitting of a LikelihoodModel.
+
+Regularization is handled by a fit_regularized method.
+
+Main Files
+==========
+
+l1_demo/demo.py
+ $ python demo.py --get_l1_slsqp_results logit
+ Runs a quick demo of l1 regularization using logistic regression.
+
+l1_demo/sklearn_compare.py
+ $ python sklearn_compare.py
+ Plots a comparison of regularization paths. Modify the source to use
+ different datasets.
+
+statsmodels/base/l1_cvxopt.py
+ fit_l1_cvxopt_cp()
+ Fits a likelihood model with l1 regularization, using the CVXOPT package.
+ Also contains many small helper functions supporting fit_l1_cvxopt_cp.
+
+statsmodels/base/l1_slsqp.py
+ fit_l1_slsqp()
+ Fits a likelihood model with l1 regularization, using scipy.optimize.
+ Also contains many small helper functions supporting fit_l1_slsqp.
+
+statsmodels/base/l1_solvers_common.py
+ Common methods used by l1 solvers
+
+statsmodels/base/model.py
+ LikelihoodModel.fit()
+ 3 lines modified to allow importing and calling the l1 fitting functions
+
+statsmodels/discrete/discrete_model.py
+ L1MultinomialResults class
+ Child of MultinomialResults
+ MultinomialModel.fit()
+ 3 lines redirecting l1 fit results to the L1MultinomialResults class
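
For orientation, here is a minimal sketch of the fit_regularized usage that this
new README describes, condensed from the short_demo.py hunks below. Deriving N
and K from the exog shape is an assumption, since their definitions fall outside
the diff context.

    import numpy as np
    import statsmodels.api as sm

    # Load the example dataset used throughout the demos
    spector_data = sm.datasets.spector.load()
    logit_mod = sm.Logit(spector_data.endog, spector_data.exog)

    # Standard maximum-likelihood fit
    logit_res = logit_mod.fit()

    # l1-regularized fit; alpha holds one penalty weight per coefficient.
    # N, K assumed to be the number of observations and parameters.
    N, K = spector_data.exog.shape
    alpha = 0.05 * N * np.ones(K)
    logit_l1_res = logit_mod.fit_regularized(method='l1', alpha=alpha, acc=1e-6)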
l1_demo/short_demo.py
@@ -11,9 +11,6 @@
The l1 Solvers
--------------
-The solvers are slower than standard Newton, and sometimes have
- convergence issues Nonetheless, the final solution makes sense and
- is often better than the ML solution.
The standard l1 solver is fmin_slsqp and is included with scipy. It
sometimes has trouble verifying convergence when the data size is
large.
@@ -36,14 +33,19 @@
logit_mod = sm.Logit(spector_data.endog, spector_data.exog)
## Standard logistic regression
logit_res = logit_mod.fit()
+
## Regularized regression
+
# Set the regularization parameter to something reasonable
alpha = 0.05 * N * np.ones(K)
+
# Use l1, which solves via a built-in (scipy.optimize) solver
logit_l1_res = logit_mod.fit_regularized(method='l1', alpha=alpha, acc=1e-6)
+
# Use l1_cvxopt_cp, which solves with a CVXOPT solver
logit_l1_cvxopt_res = logit_mod.fit_regularized(
method='l1_cvxopt_cp', alpha=alpha)
+
## Print results
print "============ Results for Logit ================="
print "ML results"
@@ -58,15 +60,19 @@
anes_exog = sm.add_constant(anes_exog, prepend=False)
mlogit_mod = sm.MNLogit(anes_data.endog, anes_exog)
mlogit_res = mlogit_mod.fit()
+
## Set the regularization parameter.
alpha = 10 * np.ones((mlogit_mod.J - 1, mlogit_mod.K))
+
# Don't regularize the constant
alpha[-1,:] = 0
mlogit_l1_res = mlogit_mod.fit_regularized(method='l1', alpha=alpha)
print mlogit_l1_res.params
+
#mlogit_l1_res = mlogit_mod.fit_regularized(
# method='l1_cvxopt_cp', alpha=alpha, abstol=1e-10, trim_tol=1e-6)
#print mlogit_l1_res.params
+
## Print results
print "============ Results for MNLogit ================="
print "ML results"
@@ -74,16 +80,19 @@
print "l1 results"
print mlogit_l1_res.summary()
#
+#
#### Logit example with many params, sweeping alpha
spector_data = sm.datasets.spector.load()
X = spector_data.exog
Y = spector_data.endog
+
## Fit
N = 50 # number of points to solve at
K = X.shape[1]
logit_mod = sm.Logit(Y, X)
coeff = np.zeros((N, K)) # Holds the coefficients
alphas = 1 / np.logspace(-0.5, 2, N)
+
## Sweep alpha and store the coefficients
# QC check doesn't always pass with the default options.
# Use the options QC_verbose=True and disp=True
@@ -94,6 +103,7 @@
method='l1', alpha=alpha, trim_mode='off', QC_tol=0.1, disp=False,
QC_verbose=True, acc=1e-15)
coeff[n,:] = logit_res.params
+
## Plot
plt.figure(1);plt.clf();plt.grid()
plt.title('Regularization Path');
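
Pieced together, the sweep hunk above corresponds to a loop along the following
lines. The loop header and the final plot calls sit outside the diff context, so
this is a hedged reconstruction, not the literal file contents.

    # Hedged reconstruction: the 'for' header and the plot commands after
    # plt.title are inferred from context, not shown in the diff.
    for n in range(N):
        alpha = alphas[n] * np.ones(K)
        logit_res = logit_mod.fit_regularized(
            method='l1', alpha=alpha, trim_mode='off', QC_tol=0.1,
            disp=False, QC_verbose=True, acc=1e-15)
        coeff[n, :] = logit_res.params

    plt.plot(alphas, coeff)  # one curve per coefficient (assumed)
    plt.show()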