Added resources and updated readme for BetaML
sylvaticus committed Sep 10, 2020
1 parent f9113cd commit da5ba66
Showing 8 changed files with 11 additions and 1 deletion.
9 changes: 8 additions & 1 deletion README.md
@@ -14,7 +14,14 @@ _Disclaimer: The following notes are a mesh of my own notes, selected transcript
<!--(PDF versions <del>may be </del> are <del>slightly</del> outdated)-->
(PDF versions may be slightly outdated)

For an implementation of the algorithms _in Julia_, see the companion repository "Beta Machine Learning Toolkit" [on GitHub](https://github.com/sylvaticus/Bmlt.jl ) or in [myBinder](https://mybinder.org/v2/gh/sylvaticus/Bmlt.jl/master) to run the code online by yourself (and if you are looking for an introductory book on Julia, have a look on [my one](https://www.julia-book.com/)).
--------------------------------------------------------------------------------
For an implementation of the algorithms _in Julia_ (a relatively recent language that combines the best features of R, Python and Matlab with the efficiency of compiled languages like C or Fortran), see the companion repository "Beta Machine Learning Toolkit" [on GitHub](https://github.com/sylvaticus/BetaML.jl) or on [myBinder](https://mybinder.org/v2/gh/sylvaticus/BetaML.jl/master) to run the code online yourself (and if you are looking for an introductory book on Julia, have a look at [mine](https://www.julia-book.com/)).
BetaML currently implements:
- Linear, average and kernel Perceptron (units 1 and 2)
- Feed-forward Neural Networks (unit 3)
- Clustering (k-means, k-medoids and the EM algorithm), recommendation system based on EM (unit 4)
- Decision Trees / Random Forest (mentioned in unit 2)
--------------------------------------------------------------------------------

[PDF all in one document](MITx_6.86x_notes.md.pdf)

@@ -335,6 +335,9 @@ The update function $\theta = \theta + x^i*y^i$ becomes then $\array{\theta\\\theta_0} = \array{\theta\\\theta_0} + y^i \array{x^i\\1}$
So now we have a general learning algorithm.
It is the simplest one, but it can be generalized to be quite powerful, and hence it is a useful algorithm to understand.

For example, we saw here a binary classification, but it can easily be extended to multiclass classification by employing a "one vs all" strategy, as explained in [this SO answer](https://stats.stackexchange.com/a/87585/263905) or on [Wikipedia](http://en.wikipedia.org/wiki/Multiclass_classification).


_Code implementation: functions `perceptron_single_step_update()`, `perceptron()`, `average_perceptron()`, `pegasos_single_step_update()` and `pegasos()` in project1._
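The project1 code itself is not reproduced in these notes. As an illustration only, a minimal NumPy sketch of the basic perceptron with offset could look like the following (the function names mirror those listed above, but the signatures are assumptions, not the actual project1 interface):

```python
import numpy as np

def perceptron_single_step_update(x, y, theta, theta_0):
    # Update only when the current example is misclassified (or lies on the boundary):
    # theta += y * x  and  theta_0 += y, i.e. the update from the notes with offset.
    if y * (np.dot(theta, x) + theta_0) <= 0:
        theta = theta + y * x
        theta_0 = theta_0 + y
    return theta, theta_0

def perceptron(X, ys, T):
    # X: (n, d) feature matrix; ys: (n,) labels in {-1, +1}; T: number of passes.
    n, d = X.shape
    theta, theta_0 = np.zeros(d), 0.0
    for _ in range(T):
        for i in range(n):
            theta, theta_0 = perceptron_single_step_update(X[i], ys[i], theta, theta_0)
    return theta, theta_0
```

On linearly separable data the returned $(\theta, \theta_0)$ classifies every training point correctly via $\mathrm{sign}(\theta \cdot x + \theta_0)$; the averaged and Pegasos variants modify only how the per-step update is accumulated.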


Binary file not shown.
