From 05dbff64d5570912c11bec98ec86be11a7ca75a2 Mon Sep 17 00:00:00 2001
From: Mark Linderman
Date: Fri, 21 Apr 2023 11:27:43 -0400
Subject: [PATCH 1/3] text change

---
 ML Notes.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/ML Notes.md b/ML Notes.md
index 9c71ec0..2532be1 100644
--- a/ML Notes.md
+++ b/ML Notes.md
@@ -1,6 +1,7 @@
 ## Basic hypothesis (model):
 $$ h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n $$
+where each $\theta_n$ is a parameter to be learned and each $x_n$ is the value of that feature in a sample from the training data (for a model with $n$ features)
 
 Basic cost function:

From 51f6a3a739660896f26389a86043ae60a1c16d62 Mon Sep 17 00:00:00 2001
From: Mark Linderman
Date: Fri, 21 Apr 2023 11:42:52 -0400
Subject: [PATCH 2/3] more tweaks

---
 ML Notes.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/ML Notes.md b/ML Notes.md
index 2532be1..5dbfeef 100644
--- a/ML Notes.md
+++ b/ML Notes.md
@@ -26,6 +26,7 @@ Gradient descent uses partial derivatives with respect to each $\theta$ value, o
 for each iteration, calculate new $\theta$ values:
+
 $$ \theta_0 = \theta_0 - \alpha\frac{1}{m}\sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})\,x_0^{(i)} $$
 $$ \theta_1 = \theta_1 - \alpha\frac{1}{m}\sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})\,x_1^{(i)} $$
 ...

From 270144b66f137af65f29f590de2f64d9c08d7153 Mon Sep 17 00:00:00 2001
From: Mark Linderman
Date: Fri, 21 Apr 2023 11:48:19 -0400
Subject: [PATCH 3/3] more tweaks to test checks

---
 ML Notes.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/ML Notes.md b/ML Notes.md
index 5dbfeef..0f82026 100644
--- a/ML Notes.md
+++ b/ML Notes.md
@@ -29,6 +29,7 @@ for each iteration, calculate new $\theta$ values:
 $$ \theta_0 = \theta_0 - \alpha\frac{1}{m}\sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})\,x_0^{(i)} $$
 $$ \theta_1 = \theta_1 - \alpha\frac{1}{m}\sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})\,x_1^{(i)} $$
+
 ...
 $$ \theta_n = \theta_n - \alpha\frac{1}{m}\sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})\,x_n^{(i)} $$
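The notes these patches edit describe the linear-regression hypothesis $h_\theta(x)$ and the batch gradient-descent update for each $\theta_j$. Below is a minimal sketch of both, assuming NumPy; it is an illustration rather than part of the patch series, and the function names, learning rate, iteration count, and synthetic data are all assumptions, not anything defined in ML Notes.md.

```python
# Sketch of the hypothesis and gradient-descent update from the patched notes.
# Assumptions (not from the patches): NumPy, these function names, alpha=0.01,
# 5000 iterations, and the synthetic data below.
import numpy as np

def hypothesis(theta, X):
    """h_theta(x) = theta_0*x_0 + theta_1*x_1 + ... + theta_n*x_n per row of X.

    X is an (m, n+1) matrix whose first column is all ones (x_0 = 1);
    theta is a vector of the n+1 parameters.
    """
    return X @ theta

def gradient_descent_step(theta, X, y, alpha):
    """One simultaneous update of every theta_j:

    theta_j = theta_j - alpha * (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i)
    """
    m = len(y)
    error = hypothesis(theta, X) - y   # shape (m,): h_theta(x^(i)) - y^(i)
    gradient = (X.T @ error) / m       # shape (n+1,): one entry per theta_j
    return theta - alpha * gradient

# Tiny usage example on synthetic data generated from y = 1 + 2x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, size=50)
X = np.column_stack([np.ones_like(x), x])   # prepend x_0 = 1
theta = np.zeros(2)
for _ in range(5000):                       # "for each iteration..."
    theta = gradient_descent_step(theta, X, y, alpha=0.01)
print(theta)                                # approximately [1.0, 2.0]
```

The vectorized `X.T @ error` computes every per-feature sum in one matrix product, so all $\theta_j$ are updated simultaneously from the same old parameter vector, exactly as the per-$\theta$ equations in the notes require.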