Course on Data Science, Machine Learning, Deep Learning (MastAI ki paathSHALA)
(Index page: Hritik Jaiswal) I referred to his repo and tried to structure mine like it; the index page format is borrowed from there.
Assignment | Content |
---|---|
1 | Getting started: Python data structures, loops, classes, linear algebra |
2 | Basic data understanding: Data science, central tendency, plots, cumulative distributions |
3 | Improving plots: Different types of plots, how to customize plots |
4 | Basic statistics : Maximum likelihood estimation, sufficient statistics, null hypothesis testing, t-test, Wilcoxon rank test |
5 | Introduction to ML: Machine learning problems, parameter vs. hyperparameter, overfitting, training, validation, testing, cross-validation, regularization (a cross-validation sketch follows the table) |
6 | Decision Trees : Definition of a decision tree, metrics of impurity, greedy algorithm to split a node, tree depth and pruning, ensemble of trees (random forest) |
7 | Bayesian decision theory : Bayes rule: Prior, likelihood, posterior, evidence, Gaussian density, sufficient statistics, maximum likelihood derivation for mean and covariance |
8 | Linear models: linear regression and its analytical solution, loss function, gradient descent and learning rate, logistic regression and its cost, SVM: hinge loss with L2 penalty (a gradient-descent sketch follows the table) |
9 | Kernelization: Dual form of an SVM, kernels for a dual form, examples of kernels and their typical uses, SVR in primal form, SVR in dual form |
10 | Feature selection and engineering: Normalization, text analysis, t-test, forward selection, features for images, features for audio, features for NLP, PCA, ZCA, K-PCA |
11 | Dense and shallow neural networks: Logistic regression as a sigmoid, single hidden layer using sigmoid and ReLU, approximation of any function using a single hidden layer, overfitting, advantage of multiple hidden layers, neural networks for regression, multi-output regression, multi-class classification using softmax, backpropagation |
12 | Advanced topics in neural networks: Weight initialization, momentum, weight decay, early stopping, batch SGD, advanced optimizers such as RMSprop and Adam |
13 | Clustering: K-means, DBSCAN, agglomerative clustering, scaling of dimensions, goodness of clustering |
14 | CNNs for image classification: Applications of computer vision, implementation of convolution, building a convolutional neural network, image classification using CNNs (a small CNN sketch follows the table) |
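
Assignments 5 and 6 cover cross-validation and decision trees. The minimal sketch below shows how the two fit together using scikit-learn; the iris dataset and the `max_depth` value are illustrative assumptions, not the course's exact setup.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Minimal sketch: 5-fold cross-validation of a depth-limited decision tree.
# The iris dataset and max_depth=3 are illustrative choices only.
X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # accuracy on each held-out fold
print(scores.mean(), scores.std())
```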
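
Assignment 8 introduces gradient descent for linear regression. Here is a minimal NumPy sketch of batch gradient descent on the mean squared error; the synthetic data, learning rate, and iteration count are assumptions made for illustration.

```python
import numpy as np

# Minimal sketch: fit y = X @ w by batch gradient descent on mean squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 samples, 3 features (synthetic)
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)   # noisy linear targets

w = np.zeros(3)                               # parameters to learn
lr = 0.1                                      # learning rate (hyperparameter)
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)     # gradient of the MSE loss
    w -= lr * grad                            # one gradient descent step

print(w)                                      # should be close to true_w
```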
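
Assignment 14 builds a CNN for image classification. The sketch below is a tiny Keras model assuming 32x32 RGB inputs and 10 classes; the layer sizes are illustrative, not the course's exact architecture.

```python
from tensorflow.keras import layers, models

# Minimal sketch of a CNN classifier; 32x32x3 inputs and 10 classes are assumptions.
model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, 3, activation="relu"),   # convolution + ReLU
    layers.MaxPooling2D(),                     # downsample feature maps
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),    # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```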
After finishing the ML/DL courses, I completed a few projects on DataCamp, listed below. These projects required me to apply both ML and DL skills in Python.