Here's my progress for the #100DaysOfMLCode Challenge.
Find my Blog here
- Collection of Study Material
- Set up the programming environment, i.e., installed dependencies
Really excited to learn Machine Learning in a more effective way
- Learnt about Linear and Logistic Regression
- Collected Dataset from Kaggle about Breast Cancer
- Modified the dataset to make use of it and understood the structure of a dataset
- Trained a logistic regression classifier to predict whether a Breast Cancer tumor is Malignant or Benign
- Used Scikit Learn Library for Logistic Regression
It was a lot of fun; I learnt a lot about dealing with rank-1 arrays and keeping track of the shapes of the matrices.
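A minimal sketch of that workflow. The log mentions a Kaggle dataset, which isn't specified, so scikit-learn's bundled Wisconsin breast-cancer data stands in here to keep the snippet self-contained:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Built-in Wisconsin breast-cancer data: label 0 = malignant, 1 = benign
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=5000)  # extra iterations so the solver converges
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))         # held-out accuracy, typically around 0.95
```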
- Learnt about Vectorization in detail and its importance in Machine Learning practices
- Completed the Machine Learning Crash Course till Generalization
Now I realize why matrix manipulation is important. Vectorization helps a lot, and it is the key reason Machine Learning algorithms now run much faster.
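A quick NumPy sketch of both points from these days: the speed difference between a Python loop and a vectorized call, and the rank-1 array shape pitfall:

```python
import numpy as np

n = 1_000_000
a, b = np.random.rand(n), np.random.rand(n)

# Loop version: one Python-level multiply-add per element
total = 0.0
for i in range(n):
    total += a[i] * b[i]

# Vectorized version: a single optimized C-level call
vectorized = np.dot(a, b)
print(np.isclose(total, vectorized))  # same result, orders of magnitude faster

# The rank-1 array pitfall: shape (5,) is neither a row nor a column vector
v = np.random.rand(5)
print(v.shape)            # (5,)
col = v.reshape(5, 1)     # make the shape explicit: a (5, 1) column vector
print(col.shape)
```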
- Continued my Deep Learning Specialization on Coursera
- Learnt more about Gradient Descent and how it works
- Studied various methods of regularizing the Neural Network to prevent overfitting
Gradient Descent is a great algorithm in Machine Learning. Enjoyed a lot while learning about this algorithm and the methods to prevent overfitting.
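The two ideas above combine naturally: here's a small NumPy sketch of gradient descent on linear regression with an L2 penalty (one of the regularization methods that prevent overfitting). The data is synthetic, just to make the snippet self-contained:

```python
import numpy as np

# Gradient descent on ridge (L2-regularized) linear regression:
# loss = ||Xw - y||^2 / (2m) + lam/2 * ||w||^2
rng = np.random.default_rng(0)
m, d = 200, 3
X = rng.normal(size=(m, d))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=m)

w = np.zeros(d)
lr, lam = 0.1, 0.01
for _ in range(500):
    grad = X.T @ (X @ w - y) / m + lam * w  # data gradient + L2 penalty term
    w -= lr * grad                          # step downhill along the gradient

print(w)  # close to [1.0, -2.0, 0.5]
```

The `lam * w` term is the whole regularizer: it shrinks weights a little on every step, which is why L2 regularization is also called weight decay.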
- Learnt about Mini-batch Gradient Descent and a bit about Stochastic Gradient Descent
- Made some progress with Machine Learning Crash Course.
It's actually great to make progress before looking through the entire training set by using small batches and performing Gradient Descent on each such batch.
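The mini-batch idea can be sketched in a few lines: shuffle the data each epoch, then take a gradient step on each small slice instead of waiting for a full pass. Synthetic data again, just for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
m, d = 1000, 2
X = rng.normal(size=(m, d))
y = X @ np.array([3.0, -1.0]) + 0.05 * rng.normal(size=m)

w = np.zeros(d)
lr, batch_size = 0.05, 64
for epoch in range(20):
    perm = rng.permutation(m)                # shuffle each epoch
    for start in range(0, m, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # gradient on this mini-batch only -- progress before seeing the full set
        grad = Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad

print(w)  # close to [3.0, -1.0]
```

With `batch_size = 1` this becomes Stochastic Gradient Descent; with `batch_size = m` it's ordinary batch Gradient Descent.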
- Implemented a Linear Regression Model on the Boston Dataset.
- Used Scikit Learn Library for the dataset and the Linear Regression Model.
- Used Matplot Lib to plot the predictions.
It was challenging to train the Linear Regression Model on a dataset I downloaded from Kaggle, so I ended up using the Boston Dataset.
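The same scikit-learn workflow, sketched below. Note that `load_boston` was removed from scikit-learn 1.2 (over ethical concerns with the dataset), so a synthetic regression problem of the same shape stands in here:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for Boston: 506 samples, 13 features
X, y = make_regression(n_samples=506, n_features=13, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print(model.score(X_test, y_test))  # R^2 on held-out data

# Plotting predictions against targets, as in the log, would look like:
#   import matplotlib.pyplot as plt
#   plt.scatter(y_test, model.predict(X_test)); plt.show()
```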
- Made some progress with the Deep Learning Specialization on Coursera
Not much for the day but got to learn new things.
- Made some progress with the ML Crash Course by Google.
- Learnt about Exponentially Weighted Averages and the Bias Correction in Exponentially Weighted Averages.
Got to learn some new Optimization Algorithms
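The exponentially weighted average and its bias correction fit in a few lines. A constant signal makes the startup bias easy to see:

```python
import numpy as np

# Exponentially weighted average: v_t = beta * v_{t-1} + (1 - beta) * x_t
# Early values are biased toward 0; dividing by (1 - beta**t) corrects this.
beta = 0.9
xs = np.ones(10)   # constant signal of 1s, so the true average is exactly 1
v = 0.0
for t, x in enumerate(xs, start=1):
    v = beta * v + (1 - beta) * x
    v_corrected = v / (1 - beta ** t)   # bias correction
    print(t, round(v, 4), round(v_corrected, 4))
# Uncorrected v creeps up slowly from 0.1; the corrected value is 1.0 at every step
```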
- Learnt about Adam Optimization Algorithm
- Implemented Adam Optimization Algorithm with the Deep Learning Course Assignments
I hadn't thought any Optimization Algorithm could be more efficient than Gradient Descent.
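Adam is just the exponentially weighted averages from the previous days, applied to the gradient and its square, with bias correction on both. A NumPy sketch minimizing a toy quadratic:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum + RMS scaling, both bias-corrected."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2  # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)             # bias corrections
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = (w - 3)^2 with Adam
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 5001):
    grad = 2 * (w - 3)
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
print(w)  # close to 3.0
```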
- Learnt and revised the basics of building a Neural Network, how it works, forward prop and backward prop
- Made some progress with ML Crash Course
Thinking of getting started with Tensorflow for Machine Learning and Deep Learning Applications
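The forward-prop and backward-prop basics above can be sketched with NumPy alone. This is a toy example, not anything from the course assignments: a tiny two-layer network learning XOR, a task a single layer cannot represent:

```python
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # hidden layer: 4 units
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # output layer: 1 unit
sigmoid = lambda z: 1 / (1 + np.exp(-z))

losses = []
lr = 0.5
for _ in range(5000):
    # Forward prop
    A1 = sigmoid(X @ W1 + b1)
    A2 = sigmoid(A1 @ W2 + b2)
    # Cross-entropy loss (small epsilon guards against log(0))
    losses.append(-np.mean(y * np.log(A2 + 1e-12) + (1 - y) * np.log(1 - A2 + 1e-12)))
    # Backward prop (sigmoid output + cross-entropy gives dZ2 = A2 - y)
    dZ2 = A2 - y
    dW2, db2 = A1.T @ dZ2, dZ2.sum(axis=0, keepdims=True)
    dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)               # sigmoid derivative
    dW1, db1 = X.T @ dZ1, dZ1.sum(axis=0, keepdims=True)
    # Gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], "->", losses[-1])  # loss should drop sharply as the net learns
```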
- Implemented Deep Dream on Live Video Stream using OpenCV
It was a bit challenging; the video stream still lags
- Learnt about Batch Normalization and how it works
- Made some progress with the Deep Learning Specialization on Coursera
Batch Normalization is a great technique to normalize the activations in a Neural Network. Enjoyed learning how it works
- Started with TensorFlow Framework
- Learned about the basics of Tensors and how to run a session in TensorFlow
Machine Learning Frameworks are cool. A single line of code can replace a lot of code and complexity.
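A minimal sketch of "build a graph, then run a session". This log predates TensorFlow 2's eager execution, so in a modern install the session API lives under `tf.compat.v1`:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # restore the graph-then-session workflow

a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b        # builds a graph node; nothing is computed yet

with tf.compat.v1.Session() as sess:
    result = sess.run(c)   # executing the graph produces the value
print(result)              # 6.0
```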
- Completed the second Course of the Deep Learning Specialization on Coursera (Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization)
Really Enjoyed a lot during the course and learnt a lot about tuning Hyperparameters, various Optimization Algorithms and much more.
- Started with the Course 3 of Deep Learning Specialization
Not much for the day
- Completed the third Course of the Deep Learning Specialization on Coursera (Structuring Machine Learning Projects)
Python is a great programming language. It is a bit difficult to switch to Python from Java, but it's worth doing so.
- Made some progress with Machine Learning Crash Course
- Saw some videos on Machine Learning with Python
Not much for the week (PS: University exams won't let you do anything interesting and beneficial).
- Started with the concepts of Convolutional Neural Networks
- Learnt about the pooling and padding concepts in CNNs
- Preparing for the Hackathon Project on Deep Learning with Android.
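Max pooling and padding are easy to sketch in NumPy; this toy example shows a 2×2 max pool shrinking a feature map, and zero-padding preparing a map for a "same" convolution:

```python
import numpy as np

def max_pool_2d(img, size=2, stride=2):
    """Max pooling: keep the largest value in each window, shrinking the map."""
    h, w = img.shape
    out_h, out_w = (h - size) // stride + 1, (w - size) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = img[i*stride:i*stride+size, j*stride:j*stride+size].max()
    return out

img = np.arange(16).reshape(4, 4)
print(max_pool_2d(img))
# [[ 5.  7.]
#  [13. 15.]]

# "Same" padding adds a zero border so a convolution keeps the spatial size:
padded = np.pad(img, 1)   # 4x4 -> 6x6, ready for a 3x3 filter with stride 1
print(padded.shape)       # (6, 6)
```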
- Won the Hackathon with our Android application that classifies Melanoma skin cancer remotely, with a nearby-hospital finder feature.
That was the first hackathon I've won. Such an amazing experience to apply Deep Learning in an Android application.
- Learnt about the classic architectures of CNNs:
LeNet-5
AlexNet
VGG-16
ResNets
- Completed Deep Learning Specialization by deeplearning.ai
Finally completed this amazing specialization. Thanks to Coursera and Andrew Ng for this course and an amazing content.
- Getting started with Tensorflow.js
- Learnt about how to create tensors and some basic operations
Got the motivation from already-implemented applications like PoseNet
- Implemented XOR Function using Neural Networks in Tensorflow
- Learned about the use of one-hot encoding in machine learning
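One-hot encoding turns a class index into a vector with a single 1, which is what cross-entropy losses in TensorFlow and elsewhere expect. A NumPy one-liner with made-up labels:

```python
import numpy as np

labels = np.array([0, 2, 1, 2])        # class indices (toy example)
num_classes = 3
one_hot = np.eye(num_classes)[labels]  # row i of the identity = one-hot for class i
print(one_hot)
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]
#  [0. 0. 1.]]
```

`sklearn.preprocessing.OneHotEncoder` does the same job for tabular data, handling unseen categories and sparse output.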