Machine Learning

This project provides a web-interface, as well as a programmatic-api for various machine learning algorithms.

Supported algorithms:

Contributing

Please adhere to contributing.md when contributing code. Pull requests that deviate from contributing.md may be labelled as invalid, and closed without merging into master. Following these best practices will ensure integrity when revisions of code, or issues, need to be reviewed.

Note: inquiries regarding support and philanthropy, to further assist with development, are welcome.

Configuration

Fork this project using one of the following methods (a shell sketch of each option follows the list):

  • simple clone: clone the remote master branch.
  • commit hash: clone the remote master branch, then checkout a specific commit hash.
  • release tag: clone the remote branch associated with the desired release tag.
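
A minimal sketch of the above options follows; the repository URL is assumed from the project name, and the commit hash and release tag are placeholders:

# simple clone: clone the remote master branch
git clone https://github.com/jeff1evesque/machine-learning.git
cd machine-learning

# commit hash: additionally check out a specific commit
git checkout <commit-hash>

# release tag: alternatively, check out the desired release tag
git checkout tags/<release-tag>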

Installation

To proceed with the installation of this project, users will need to decide whether to use the rancher ecosystem, or docker-compose. The former will likely be less reliable, since the corresponding install script may not work consistently across different operating systems. Additionally, this project assumes rancher as the primary method to deploy and run the application. So, when using the docker-compose alternative, keep track of what the corresponding endpoints should be.

If users choose rancher, both docker and rancher must be installed. Docker must be installed manually, to fulfill a set of dependencies. Once completed, rancher can be installed, and automatically configured, by simply executing the provided bash script from the docker quickstart terminal:

cd /path/to/machine-learning
./install_rancher

Note: the installation and configuration of rancher have been outlined, if more explicit instructions are needed.

If users choose to forgo rancher, and use docker-compose instead, then simply install docker, as well as docker-compose. This will allow the application to be deployed from any terminal console:

cd /path/to/machine-learning
docker-compose up

Note: the installation and configuration of docker-compose have been outlined, if more explicit instructions are needed.

Execution

Both the web-interface and the programmatic-api have corresponding unit tests, which can be reviewed and implemented. It is important to remember that the installation method dictates the endpoint. More specifically, if the application was installed via rancher, the endpoint will take the form https://192.168.99.101:XXXX. However, if the docker-compose up alternative was used, the endpoint will likely change to https://localhost:XXXX, or https://127.0.0.1:XXXX.

Web Interface

The web-interface can be accessed within the browser at https://192.168.99.101:8080:

[screenshot: web-interface]

The following sessions are available:

  • data_new: store the provided dataset(s) within the implemented sql database.
  • data_append: append additional dataset(s) to an existing representation (from an earlier data_new session), within the implemented sql database.
  • model_generate: using previously stored dataset(s) (from an earlier data_new, or data_append session), generate a corresponding model into the implemented nosql datastore.
  • model_predict: using a previously stored model (from an earlier model_generate session), from the implemented nosql datastore, along with user supplied values, generate a corresponding prediction.

When using the web-interface, it is important to ensure the csv, xml, or json file(s) representing the corresponding dataset(s) are properly formatted. Poorly formatted dataset(s) will fail to create their respective json dataset representation(s), and subsequently will not be stored into the corresponding database tables. This will prevent any models, and subsequent predictions, from being made.

The following dataset(s) show acceptable syntax:

Note: each dependent variable value (for JSON datasets) is an array (square brackets), since each dependent variable may have multiple observations.
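
For instance, a JSON dataset with the structure described in the above note could resemble the following sketch; the variable names and values are hypothetical, and intended only to illustrate the array-valued dependent variables:

{
    "dep-variable-1": [
        {"indep-variable-1": 23.45, "indep-variable-2": 12.1},
        {"indep-variable-1": 21.03, "indep-variable-2": 10.9}
    ],
    "dep-variable-2": [
        {"indep-variable-1": 18.75, "indep-variable-2": 14.2}
    ]
}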

Programmatic Interface

The programmatic-interface, or set of APIs, allows users to implement the following sessions:

  • data_new: store the provided dataset(s) within the implemented sql database.
  • data_append: append additional dataset(s) to an existing representation (from an earlier data_new session), within the implemented sql database.
  • model_generate: using previously stored dataset(s) (from an earlier data_new, or data_append session), generate a corresponding model into the implemented nosql datastore.
  • model_predict: using a previously stored model (from an earlier model_generate session), from the implemented nosql datastore, along with user supplied values, generate a corresponding prediction.

A POST request can be implemented in python, as follows:

import requests

# 'token' is a placeholder for a valid token (see the /login note below)
endpoint = 'https://192.168.99.101:9090/load-data'
headers = {
    'Authorization': 'Bearer ' + token,
    'Content-Type': 'application/json'
}

# 'json_string_here' is a placeholder for the JSON payload describing the desired session
requests.post(endpoint, headers=headers, data=json_string_here)

Note: more information regarding how to obtain a valid token can be reviewed in the /login documentation.

Note: various data attributes can be nested in the above POST request, as sketched below.
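
The following is a minimal sketch of such a nested payload; the attribute names ('properties', 'dataset', and their children) are hypothetical, and should be verified against the corresponding session documentation:

import json
import requests

token = 'valid-token-from-login'  # placeholder: obtain via the /login documentation
endpoint = 'https://192.168.99.101:9090/load-data'
headers = {
    'Authorization': 'Bearer ' + token,
    'Content-Type': 'application/json'
}

# hypothetical nested attributes: session 'properties', plus an inline 'dataset'
payload = {
    'properties': {
        'session_type': 'data_new',
        'session_name': 'sample-session'
    },
    'dataset': {
        'dep-variable-1': [
            {'indep-variable-1': 23.45, 'indep-variable-2': 12.1}
        ]
    }
}

requests.post(endpoint, headers=headers, data=json.dumps(payload))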

It is important to remember that the docker-compose.development.yml has defined two port forwards, each assigned to its corresponding reverse proxy. This allows port 8080 on the host to map into the webserver-web container. A similar case, for the programmatic-api, uses port 9090 on the host.
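
A hypothetical excerpt illustrating these two port forwards follows; the service names and container-side ports are assumptions, not taken from the actual compose file:

web:
  ports:
    - "8080:443"    # host port 8080 -> web-interface reverse proxy
api:
  ports:
    - "9090:443"    # host port 9090 -> programmatic-api reverse proxy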