Some small quick changes. (nothing important)

Added stub methods we would like to add to the regression library, and a note to the README about how I would like to modify the regression library before moving on to other library elements.
commit 74a6c797a621e444784b087d4c4b53412107d349 1 parent f1eeffc
Giuseppe Burtini authored December 04, 2011
README (+8)
@@ -19,3 +19,11 @@ If you need to work with the Learning Library in an environment
 that is not conducive to the GPL, please contact me at <joe@truephp.com>
 and we can discuss alternative licensing terms.
 
+--
+
+I would like to rewrite the regression stuff to be more generic;
+in particular, it should have an optimization function that takes
+two functions as inputs ($objective, $derivatives($number)) and
+returns a maximum accordingly. That would allow the methods to
+be genericized over different types of problems (say, linear vs.
+logistic).
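The note above describes an optimizer that accepts the objective and its derivatives as first-class functions. A minimal sketch of that idea, using plain gradient ascent: the function name `ll_maximize` and its signature are hypothetical, not part of the library.

```php
<?php
// Hypothetical sketch of the generic optimizer proposed in the README note:
// gradient ascent over a callable $objective and its $gradient, returning
// the parameter vector near a (local) maximum. All names are illustrative.
function ll_maximize(callable $objective, callable $gradient,
                     array $initial, float $alpha = 0.1,
                     int $repetitions = 1000): array {
    $params = $initial;
    for ($i = 0; $i < $repetitions; $i++) {
        $grad = $gradient($params);
        foreach ($params as $j => $value) {
            // Ascend: step each parameter in the direction of the gradient.
            $params[$j] = $value + $alpha * $grad[$j];
        }
    }
    return $params;
}

// Example: maximize f(x) = -(x - 3)^2, whose maximum is at x = 3.
$best = ll_maximize(
    function (array $p) { return -pow($p[0] - 3, 2); },
    function (array $p) { return [-2 * ($p[0] - 3)]; },
    [0.0]
);
// $best[0] converges toward 3.
```

Because only the two callables differ, the same loop would serve linear least squares (maximizing negative squared error) or logistic regression (maximizing log-likelihood), which is exactly the genericization the note asks for.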
lib/parametric/regression.php (+13 −1)
@@ -10,7 +10,7 @@
 
       $xs represents the data you want to regress on.
       $ys represents the list of "answers" (i.e., y = b0 + b1x1 + b2x2)
-      $method is either gradient or normal
+      $method is either gradient or normal (right now)
       $alpha is used in gradient descent and represents the "learning rate", set this as high as you can get away with.
       $initialization is the vector of values to start your b0, b1, b2 values at during the gradient descent. if you don't pass it, will use a vector of 0s.
       $repetitions is the number of times to repeat. This is required unless LL_AUTODETECT_CONVERGENCE is defined to be a floating point value, in which case, we will repeat until we're within that distance from the previous iteration.
@@ -26,6 +26,18 @@ function ll_linear_regression($xs, $ys, $method="gradient", $alpha=null, $initia
             return _ll_gradient_descent($xs, $ys, $initialization, $alpha, $repetitions);
          break;
 
+         case "stochastic":
+            trigger_error("Stochastic gradient descent not implemented.");
+         break;
+
+         case "conjugate":
+            trigger_error("Conjugate gradient not implemented.");
+         break;
+
+         case "bfgs": case "lbfgs": case "l-bfgs":
+            trigger_error("BFGS not implemented.");
+         break;
+
          case "normal": 
             // compute the normal equations method... parameters = (x' x)^-1 x' y
             return _ll_normal_equation($xs, $ys);
