Optimization – finding parameters of linear regression using various algorithms
Updated Jun 30, 2024 - Python
Custom implementation of a neural network from scratch using Python
Robust Mini-batch Gradient Descent models
Generic L-layer fully connected neural network implemented "straight in Python" using NumPy.
Performing gradient descent to calculate the slope and intercept of a linear regression, using a sum-of-squared-residuals or mean-squared-error loss function.
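The technique described above can be sketched in a few lines. This is a minimal illustration, not code from any listed repository; the toy data, learning rate, and epoch count are assumptions chosen for the example:

```python
import numpy as np

def fit_line(x, y, lr=0.05, epochs=1000):
    """Fit y = m*x + b by gradient descent on the mean-squared-error loss."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        pred = m * x + b
        # Gradients of MSE = (1/n) * sum((pred - y)^2) w.r.t. m and b
        grad_m = (2.0 / n) * np.dot(pred - y, x)
        grad_b = (2.0 / n) * np.sum(pred - y)
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

# Noise-free toy data on the exact line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0
m, b = fit_line(x, y)
```

Using the sum of squared residuals instead of the mean only rescales the gradients by `n`, which can be absorbed into the learning rate.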
Python implementation of a multilayer perceptron neural network trained with Mini-Batch Gradient Descent
Custom multilayer perceptron (MLP)
Various methods for Deep Learning, SGD and Neural Networks.
Fully connected neural network using mini-batch gradient descent and softmax for classification on the MNIST dataset
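The mini-batch loop with a softmax output that several of these repositories describe can be sketched as follows. This is a hypothetical toy example (two Gaussian blobs standing in for MNIST digit classes), not any repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Two well-separated Gaussian blobs stand in for two digit classes.
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.repeat([0, 1], 100)
Y = np.eye(2)[y]  # one-hot targets

W = np.zeros((2, 2))
b = np.zeros(2)
lr, batch_size = 0.1, 32
for epoch in range(50):
    idx = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        bi = idx[start:start + batch_size]
        P = softmax(X[bi] @ W + b)
        # Gradient of cross-entropy w.r.t. the logits is (P - Y), averaged
        G = (P - Y[bi]) / len(bi)
        W -= lr * X[bi].T @ G
        b -= lr * G.sum(axis=0)

acc = (softmax(X @ W + b).argmax(axis=1) == y).mean()
```

The same loop extends to a multilayer perceptron by backpropagating `G` through hidden layers; only the parameter updates per mini-batch change.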
A basic neural net built from scratch.
MNIST handwritten digits classification using a 3-layer neural net with 98.7% accuracy
3-layer linear neural network to classify the MNIST dataset using TensorFlow
Regression models on Boston Houses dataset
This project explored TensorFlow, testing the effects of regularization and mini-batch training on the performance of deep neural networks
Classify the MNIST dataset using ridge regression; optimize the algorithm with SGD, stochastic dual coordinate ascent, and mini-batching
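Ridge regression optimized with mini-batch SGD, as the entry above describes, can be sketched like this. The synthetic data, penalty strength, and step size are illustrative assumptions, not values from the repository:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression problem: y = X @ w_true + small noise
n, d = 500, 5
X = rng.normal(size=(n, d))
w_true = np.arange(1.0, d + 1)  # [1, 2, 3, 4, 5]
y = X @ w_true + rng.normal(scale=0.1, size=n)

w = np.zeros(d)
lr, lam, batch = 0.05, 0.01, 25
for epoch in range(200):
    idx = rng.permutation(n)
    for s in range(0, n, batch):
        bi = idx[s:s + batch]
        err = X[bi] @ w - y[bi]
        # Gradient of (1/2b)*||X_b w - y_b||^2 + (lam/2)*||w||^2
        grad = X[bi].T @ err / len(bi) + lam * w
        w -= lr * grad
```

For classification, each MNIST class gets its own ridge regressor against a one-hot target and the prediction is the argmax over classes; stochastic dual coordinate ascent instead updates one dual variable per sample in the equivalent dual problem.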