
Machine Learning

As a quick recap, a neural network is a family of machine learning algorithms loosely modeled on the biological neural networks of the human brain. It consists of a set of connected nodes, called neurons, and is trained by gradient descent. The weights in each layer begin with random values and are iteratively improved over time to make the network more accurate. A loss function quantifies how inaccurate the network is, and a procedure called backpropagation determines whether each weight should be increased or decreased to reduce the loss.
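The snippet below is a rough sketch of that training loop, assuming TensorFlow 2.x; the toy data, layer sizes, and learning rate are made up for illustration, not taken from this repository.

```python
import tensorflow as tf

# Toy labeled data, made up for illustration: 4 samples with 3 features each.
x = tf.random.normal((4, 3))
y = tf.constant([[0.0], [1.0], [1.0], [0.0]])

# A tiny network whose weights start as random values.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="sigmoid"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model(x)  # run one forward pass so the weights are created

loss_fn = tf.keras.losses.BinaryCrossentropy()  # quantifies how inaccurate the network is
learning_rate = 0.1

for step in range(200):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x))
    # Backpropagation: how much would the loss change if each weight changed?
    grads = tape.gradient(loss, model.trainable_variables)
    # Gradient descent: move each weight a small step in the direction that reduces the loss.
    for w, g in zip(model.trainable_variables, grads):
        w.assign_sub(learning_rate * g)
```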

TensorFlow

Thanks to TensorFlow and the Google Brain team!

TensorFlow makes developers' and engineers' lives easier by providing comprehensive machine learning toolkits and libraries with simple APIs. For example, we do not need to build the optimizer from scratch, since TensorFlow ships many built-in optimizers that can be called with a single line of code.
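For instance, here is a minimal sketch (assuming TensorFlow 2.x) of creating a built-in optimizer in one line and plugging it into a Keras model:

```python
import tensorflow as tf

# A built-in optimizer in one line; no need to implement gradient descent by hand.
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

# It plugs straight into a Keras model at compile time.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=optimizer, loss="mse")
```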

Supervised Learning

Supervised learning is the machine learning task of learning a function that maps an input to an output based on example input-output pairs. It infers a function from labeled training data consisting of a set of training examples.

The most widely used learning algorithms are:

  • Support Vector Machines
  • Linear Regression
  • Logistic Regression
  • Naive Bayes
  • Linear Discriminant Analysis
  • Decision Trees
  • K-nearest Neighbor Algorithm
  • Neural Networks (Multilayer perceptron)
  • Similarity Learning
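As a minimal sketch of supervised learning, the example below fits a linear regression (one of the algorithms listed above) to labeled input-output pairs in TensorFlow; the data is made up for illustration.

```python
import numpy as np
import tensorflow as tf

# Labeled training examples, made up for illustration:
# inputs x and targets y that roughly follow y = 2x + 1.
x_train = np.linspace(-1.0, 1.0, 50).astype("float32").reshape(-1, 1)
y_train = 2.0 * x_train + 1.0 + 0.05 * np.random.randn(50, 1).astype("float32")

# Linear regression as a single dense unit: the model infers a function
# that maps inputs to outputs from the labeled pairs.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")
model.fit(x_train, y_train, epochs=100, verbose=0)

# The learned function can now map a new input to a predicted output.
print(model.predict(np.array([[0.5]], dtype="float32"), verbose=0))
```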

Activation Functions

  1. Sigmoid function
     The sigmoid function is used for binary classification, for example in the logistic regression model, and it is a common activation function when building artificial neurons. In statistics, sigmoid curves also appear as cumulative distribution functions. The sigmoid returns a real-valued output between 0 and 1, and its first derivative is always non-negative, so the function never decreases as its input grows (see the numerical sketch after this list).

  2. Softmax function
     The softmax function is used for multi-class (multinomial) logistic regression and, when building neural networks, typically in the output layer. Its main properties: every calculated probability lies in the range 0 to 1, and the probabilities sum to 1.
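The snippet below is a minimal sketch of these properties using TensorFlow's built-in tf.math.sigmoid and tf.nn.softmax; the input values are chosen only for illustration.

```python
import tensorflow as tf

z = tf.constant([-2.0, 0.0, 3.0])  # arbitrary scores, chosen only for illustration

# Sigmoid squashes each value into (0, 1); used for binary classification.
s = tf.math.sigmoid(z)
print(s)                    # approximately [0.12, 0.50, 0.95]

# Its first derivative, sigmoid(z) * (1 - sigmoid(z)), is always non-negative.
print(s * (1.0 - s))

# Softmax turns a vector of scores into probabilities that lie in [0, 1]
# and sum to 1; used for multi-class classification.
p = tf.nn.softmax(z)
print(p, tf.reduce_sum(p))  # probabilities, and their sum (1.0)
```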

Sources:

  1. Katrina Wakefield, "Predictive analytics and machine learning", SAS UK
