Implementation example of logistic regression
Updated Jun 25, 2018 - Python
Finding Donors for CharityML - Machine Learning Nanodegree from Udacity
A neural network (one input, three hidden, one output layer) to classify the MNIST handwritten digits
Basic level implementation of Neural Network
Predict a Pulsar Star using Stochastic Gradient Descent, K nearest neighbors, Support Vector Machine and Decision Tree classifiers.
Neural network-based character recognition using MATLAB. The algorithm does not rely on external ML modules, and is rigorously defined from scratch. A report is included which explains the theory, algorithm performance comparisons, and hyperparameter optimization.
Assignments of my CST Part II Deep Neural Networks unit
Project 2 Group C - Predicting FinTech Bootcamp Graduate Salaries
Fundamentals of Artificial Intelligence and Deep Learning Frameworks
Simple convolutional neural network (purely numpy) to classify the original MNIST dataset. My first project with a convnet. 🖼
Application and illustration of a wide range of optimization methods
Gradient descent is a technique for fitting machine learning models with differentiable loss functions: it repeatedly computes the first-order derivative of the loss function and adjusts the parameters in the direction that reduces the loss.
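The loop described above can be sketched for simple linear regression with numpy; the toy data, learning rate, and iteration count here are illustrative assumptions, not taken from any of the listed repositories:

```python
import numpy as np

# Toy data: y = 2x + 1 with a little noise (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=100)
y = 2 * X + 1 + rng.normal(0, 0.01, size=100)

w, b = 0.0, 0.0   # parameters to fit
lr = 0.5          # learning rate (assumed value)

for _ in range(2000):
    err = w * X + b - y
    # First-order derivatives of the mean squared error w.r.t. w and b
    grad_w = 2 * np.mean(err * X)
    grad_b = 2 * np.mean(err)
    # Step against the gradient to reduce the loss
    w -= lr * grad_w
    b -= lr * grad_b

# w and b converge to roughly 2 and 1, the values used to generate the data
```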
A simplified explanation of gradient descent for linear regression in python using numpy
Implementation of SVM using stochastic gradient descent - Stony Brook CSE512 Machine Learning
Compilation of different ML algorithms implemented from scratch (and optimized extensively) for the courses COL774: Machine Learning (Spring 2020) & COL772: Natural Language Processing (Fall 2020)
Utilizing deep learning to deblur images
This repository includes implementations of the basic optimization algorithms: batch, mini-batch, and stochastic gradient descent, as well as NAG, Adagrad, RMSProp, and Adam
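Two of the update rules named above can be sketched side by side; this is a minimal illustration on a 1D quadratic, with all hyperparameter values assumed (defaults commonly quoted for Adam), not code from the repository:

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    """Plain (stochastic) gradient descent: step against the gradient."""
    return w - lr * grad

def adam_step(w, grad, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected moving averages of the gradient and its square."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad          # first-moment estimate
    v = b2 * v + (1 - b2) * grad**2       # second-moment estimate
    m_hat = m / (1 - b1**t)               # bias corrections
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

# Minimize f(w) = (w - 3)^2 with each optimizer; gradient is 2(w - 3)
w_sgd, w_adam, state = 0.0, 0.0, (0.0, 0.0, 0)
for _ in range(200):
    w_sgd = sgd_step(w_sgd, 2 * (w_sgd - 3))
    w_adam, state = adam_step(w_adam, 2 * (w_adam - 3), state)
```

Both iterates approach the minimizer w = 3; Adam takes near-constant-size steps early on because the update is normalized by the gradient's running magnitude.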
A project performing gradient descent and stochastic average gradient descent for matrix completion. The algorithms are tested on some synthetic data before being used on downscaled real X-ray absorption data from a spectromicroscopy experiment. The algorithms' behaviours and outputs are examined in the report.
Optimizing a neural network with the stochastic gradient descent algorithm.