ShowML Logo

A Python package of Machine Learning Algorithms implemented from scratch.

The aim of this package is to present the working behind fundamental Machine Learning algorithms in a transparent and modular way.

NOTE: These implementations favor readability over performance and are not optimized for computational efficiency.

📝 Table of Contents

  • Getting Started
  • Contents
  • Contributing
  • License

🏁 Getting Started

To install the package directly from PyPi:

$ pip install showml

To clone the repository and view the source files:

$ git clone https://github.com/hasnainroopawalla/ShowML.git
$ cd ShowML
$ pip install -r requirements.txt

Remember to add ShowML/ to the PYTHONPATH environment variable before using the package locally:

  • For Windows:
    $ set PYTHONPATH=%PYTHONPATH%;<path-to-directory>\ShowML

  • For macOS:
    $ export PYTHONPATH=/<path-to-directory>/ShowML:$PYTHONPATH

  • For Linux:
    $ export PYTHONPATH="${PYTHONPATH}:/<path-to-directory>/ShowML"

Check out: showml/examples/
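
A hypothetical usage sketch is shown below. The import paths come from the Contents section, but the compile/fit/predict method names and their arguments are assumptions modeled on a scikit-learn/Keras-style interface and may not match the actual API; refer to showml/examples/ for working scripts.

    import numpy as np

    from showml.linear_model.regression import LinearRegression
    from showml.losses import MeanSquaredError
    from showml.optimizers import SGD

    # Toy data: y = 3x + 2 with a little noise.
    X = np.linspace(0, 10, 100).reshape(-1, 1)
    y = 3 * X.ravel() + 2 + np.random.normal(scale=0.5, size=100)

    # NOTE: the method names and arguments below are assumptions, not the
    # verified ShowML interface; consult showml/examples/ for real usage.
    model = LinearRegression()
    model.compile(optimizer=SGD(), loss=MeanSquaredError())
    model.fit(X, y)
    predictions = model.predict(X)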

📦 Contents

ShowML currently includes the following; the repository will continue to expand with implementations of more Machine Learning algorithms.

Models

  • Linear

    • Linear Regression (showml.linear_model.regression.LinearRegression)
    • Logistic Regression (showml.linear_model.regression.LogisticRegression)
  • Non-Linear

    • Sequential (showml.deep_learning.model.Sequential)
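
To make the working of the linear models listed above concrete, here is a small from-scratch NumPy sketch of a single gradient-descent step for logistic regression. It illustrates the underlying math only and is not ShowML's own code.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def logistic_regression_step(X, y, w, b, lr=0.1):
        """One gradient-descent update for logistic regression (labels y in {0, 1})."""
        y_pred = sigmoid(X @ w + b)      # predicted probabilities
        error = y_pred - y               # gradient of binary cross-entropy w.r.t. the logits
        grad_w = X.T @ error / len(y)    # gradient w.r.t. the weights
        grad_b = error.mean()            # gradient w.r.t. the bias
        return w - lr * grad_w, b - lr * grad_b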

Deep Learning

  • Layers

    • Dense (showml.deep_learning.layers.Dense)
  • Activations

    • Sigmoid (showml.deep_learning.activations.Sigmoid)
    • ReLU (showml.deep_learning.activations.Relu)
    • Softmax (showml.deep_learning.activations.Softmax)
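
The activations listed above follow their standard definitions. For reference, here are the forward passes in plain NumPy (the math only, not necessarily ShowML's exact implementation):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))   # squashes values into (0, 1)

    def relu(x):
        return np.maximum(0.0, x)         # zeroes out negative values

    def softmax(x):
        # Subtract the row-wise max for numerical stability before exponentiating.
        e = np.exp(x - np.max(x, axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)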

Optimizers

  • Stochastic/Batch/Mini-Batch Gradient Descent (showml.optimizers.SGD)
  • Adaptive Gradient (showml.optimizers.AdaGrad)
  • Root Mean Squared Propagation (showml.optimizers.RMSProp)
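
For reference, the textbook update rules behind these optimizers, sketched in NumPy (illustrative only, not ShowML's exact implementation):

    import numpy as np

    def sgd_update(w, grad, lr=0.01):
        # Vanilla (stochastic/batch/mini-batch) gradient descent step.
        return w - lr * grad

    def adagrad_update(w, grad, cache, lr=0.01, eps=1e-8):
        # AdaGrad: accumulate squared gradients and scale each parameter's step.
        cache = cache + grad ** 2
        return w - lr * grad / (np.sqrt(cache) + eps), cache

    def rmsprop_update(w, grad, cache, lr=0.01, beta=0.9, eps=1e-8):
        # RMSProp: exponentially decaying average of squared gradients.
        cache = beta * cache + (1 - beta) * grad ** 2
        return w - lr * grad / (np.sqrt(cache) + eps), cache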

Loss Functions

  • Mean Squared Error (showml.losses.MeanSquaredError)
  • Binary Cross Entropy (showml.losses.BinaryCrossEntropy)
  • Categorical Cross Entropy (showml.losses.CrossEntropy)
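
And the standard definitions of these losses in plain NumPy (again illustrative, not ShowML's exact code):

    import numpy as np

    def mean_squared_error(y_true, y_pred):
        return np.mean((y_true - y_pred) ** 2)

    def binary_cross_entropy(y_true, y_pred, eps=1e-12):
        y_pred = np.clip(y_pred, eps, 1 - eps)   # avoid log(0)
        return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

    def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
        # y_true is one-hot encoded; y_pred holds per-class probabilities per row.
        y_pred = np.clip(y_pred, eps, 1 - eps)
        return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))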

✏️ Contributing

  1. Fork the repository.
  2. Commit and push your changes to your own branch.
  3. Install the housekeeping dependencies (pre-commit, mypy and pytest):
    $ pip install pre-commit mypy pytest
    
  4. Run these housekeeping checks locally and make sure all of them succeed (required for the CI to pass):
    $ pre-commit run -a
    $ mypy .
    $ pytest
    
  5. Open a Pull Request and I'll review it.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.