From Basic to Advanced
Currently under heavy reconstruction!
- Add definitions, an introduction, and a detailed history of ML (from the Deep Learning book)
- Prerequisites Math
- Prerequisites Computer Science
- 1. Basics Machine Learning
- 2. Advanced Machine Learning
- 3. Machine Learning Research
- Other Resources
Important topics are Linear Algebra, Probability Theory and Statistics, Regression, Multivariate Calculus, Algorithms, and Convex Optimization. Optional: Random Forests, SVMs, Naive Bayes, Gradient Boosting, PCA:
- Course: Khan Academy - Intro to Linear Algebra, Statistics, and Calculus: Linear Algebra, Probability & Statistics, Multivariable Calculus, and Optimization
- Course: MIT - Linear Algebra: linear equations, matrix multiplication, factorization, transposes, permutations, the spaces R^n, column space, nullspace, pivot variables, independence, basis, dimension, the four fundamental subspaces, graphs and networks, incidence matrices, orthogonal vectors, Gram-Schmidt, properties of determinants, eigenvalues, eigenvectors, differential equations, Markov matrices, complex matrices, singular value decomposition, linear transformations, pseudoinverse
- Course: Harvard/Edx - Intro to Statistics: Approx. 7 weeks to complete - Probability, Counting, and Story Proofs, Conditional Probability and Bayes' Rule, Discrete Random Variables, Continuous Random Variables, Averages, Law of Large Numbers, and Central Limit Theorem, Joint Distributions and Conditional Expectation, Markov Chains
- Course: Harvard - Statistics and Probability: combinatorics, basic probability, conditional probability, random variables, expected values, conditional expectation, discrete distributions, continuous distributions, jointly distributed random variables, convergence, inequalities, Markov chains
- Course: Coursera - Mathematics for Machine Learning: Approx. 2 months to complete - Linear Algebra (vectors, matrices), Multivariate Calculus (multivariate chain rule, Taylor series, linearisation, optimisation, regression), Principal Component Analysis (inner products, orthogonal projections)
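As a taste of the PCA topics listed above (inner products, orthogonal projections, singular value decomposition), here is a minimal NumPy sketch; the data and variable names are purely illustrative:

```python
# Minimal PCA sketch via SVD. NumPy only; toy data, illustrative names.
import numpy as np

rng = np.random.default_rng(0)
# 200 2-D points stretched along the direction (1, 1): one dominant component.
data = rng.normal(size=(200, 2)) @ np.array([[3.0, 3.0], [0.3, -0.3]])

centered = data - data.mean(axis=0)           # PCA requires centered data
u, s, vt = np.linalg.svd(centered, full_matrices=False)

components = vt                               # rows are the principal directions
explained_variance = s**2 / (len(data) - 1)   # variance along each direction

# Orthogonal projection onto the first principal component.
projected = centered @ components[0]
print(explained_variance)
```

The first variance should dominate, since the toy data was generated with one strong direction.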
Computer Science basics:
- Course: MIT/Edx - Introduction to Computer Science and Programming in Python: Approx. 9 weeks to complete - Computation, Branching and Iteration, String Manipulation, Guess and Check, Approximations, Bisection, Decomposition, Abstraction, Functions, Tuples, Lists, Aliasing, Mutability, Cloning, Recursion, Dictionaries, Testing, Debugging, Exceptions, Assertions, Object-Oriented Programming, Python Classes, Inheritance, Program Efficiency, Searching, Sorting
Programming languages: Python (with NumPy), Octave/MATLAB, and R
- Course: Coursera/RICE - An Introduction to Interactive Programming in Python (Part 1): Approx. 29 hours to complete; Statements, expressions, variables, Functions, logic, conditionals, Event-driven programming, local/global variables, canvas, drawing, timers, lists, keyboard input, the basics of modeling motion
- Course: Coursera/RICE - An Introduction to Interactive Programming in Python (Part 2): Approx. 25 hours to complete; mouse input, list methods, dictionaries, classes and object-oriented programming, basic game physics, sprites, sets and animation
Programs: Octave/MATLAB, Jupyter Notebooks, R
- MATLAB introduction course
Python Libraries:
- NumPy
- Scikit-learn
- Matplotlib
- Pandas
- XGBoost
- Visualization: Seaborn, Matplotlib
- SciPy (optimizers)
Frameworks: TensorFlow, Keras, Torch, PyTorch, Caffe (https://towardsdatascience.com/deep-learning-framework-power-scores-2018-23607ddf297a)
- Framework: Google (open source) - TensorFlow: a scalable open-source machine learning library with a Python API. It is used for both research and production at Google.
- Framework: François Chollet (open source) - Keras: a high-level API and open-source neural network library written in Python
- Framework: Torch: an open-source machine learning library, written in Lua, C, CUDA, and C++
- Framework: Caffe and Caffe2 - I never played around with Caffe, but it was one of the first deep learning libraries out there. Caffe2 is notable because it is the production framework that Facebook uses to serve its models. According to Soumith Chintala, researchers at Facebook try out new models and research ideas using PyTorch and deploy using Caffe2.
- Framework: Microsoft (open source) - CNTK: Microsoft Cognitive Toolkit (CNTK), an open-source deep-learning toolkit
- Tool: Jupyter Notebook: an open-source web application that allows you to create and share documents that contain live code, equations, and visualizations (https://www.dataquest.io/blog/advanced-jupyter-notebooks-tutorial/)
- Wolfram - Machine Learning Basic Courses: https://www.wolfram.com/wolfram-u/machine-learning-zero-to-AI-60-minutes/
Automated ML Frameworks:
- Video: 3Blue1Brown - Neural Networks playlist: building blocks, gradient descent, backpropagation
- Application: TensorFlow - Neural Network Playground: interactive visualization of neural networks
- Course: Udemy - Deep Learning A-Z™: Hands-On Artificial Neural Networks: Artificial Neural Networks, Convolutional Neural Networks, Recurrent Neural Networks, Self-Organizing Maps, Boltzmann Machines, Autoencoders
- Course: Udacity - Deep Learning: Approx. 3 months to complete - basic classification, gradient descent, ReLUs, chain rule, backpropagation, L2 regularization, dropout, hyperparameter tuning, CNNs, Word2Vec, t-SNE, LSTMs
https://hackernoon.com/my-self-created-ai-masters-degree-ddc7aae92d0e
- Machine Learning: Columbia University via edX
- CMU - Fall 2017: 10-707 Topics in Deep Learning
- Google Cloud on Coursera (3x) [Specialization, Engineering, Advanced]
- Advanced ML: https://www.coursera.org/specializations/aml
- Course: fast.ai
- Machine Learning CS 229 Stanford cheatsheets: https://stanford.edu/~shervine/teaching/cs-229/
- TensorFlow Tutorial
- Udacity Nanodegree
Supervised, Unsupervised, Reinforcement learning
Artificial Neural Networks: If someone wants to get started with deep learning, I think the best approach is to first get familiar with machine learning (which you will all have done by this point) and then start with neural networks. Following the same high-level understanding -> model specifics -> code -> practical example approach would be great here as well.
- How Deep Neural Networks Work: Another great tutorial by Brandon Rohrer.
- A Friendly Introduction to Deep Learning and Neural Networks: Another visually appealing presentation of neural nets.
Convolutional Neural Networks:
A convolutional neural network is a special type of neural network that has been successfully used for image processing tasks.
- A Beginner's Guide to Understanding CNNs: Shameless plug LOL
- CS 231N Homepage: Stanford CS231N is a grad course focused on CNNs that was originally taught by Fei Fei Li, Andrej Karpathy, and others.
- CS 231N Video Lectures: All the lecture videos from 2016. There will likely be a playlist for 2017 somewhere on YouTube as well.
- Brandon Rohrer YouTube Tutorial: Great visuals on this tutorial video.
- Andrew Ng's CNN Course: Videos from Andrew Ng's deep learning course.
- Stanford CS 231N - Convolutional Neural Networks for Visual Recognition
- Visualizing what ConvNets learn Stanford - http://cs231n.github.io/understanding-cnn/
- Feature Visualization: How neural networks build up their understanding of images
- Visualizing CNN filters with Keras (https://jacobgil.github.io/deeplearning/filter-visualizations)
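The core operation these resources explain can be sketched in plain NumPy: a small filter slides over the image, and each output value is the dot product of the filter with one image patch. This is a minimal, illustrative "valid" convolution (stride 1, single channel, no bias), not how frameworks actually implement it:

```python
# Sketch of a convolution layer's core operation: slide a filter over the
# image and take dot products ("valid" mode, stride 1, no padding).
import numpy as np

def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel = dot product of the filter with one patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A difference filter responds where intensity changes from left to right.
image = np.zeros((5, 5))
image[:, 2:] = 1.0                      # left half dark, right half bright
edge_filter = np.array([[-1.0, 1.0]])   # 1x2 horizontal difference

response = conv2d(image, edge_filter)
print(response)                         # peaks exactly at the vertical edge
```

Real CNNs learn many such filters per layer instead of hand-designing them, and stack layers so later filters respond to combinations of earlier ones.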
Recurrent Neural Networks: A recurrent neural network is a special type of neural network that has been successfully used for natural language processing tasks.
- Deep Learning Research Paper Review: NLP: Too many shameless plugs or nah? LOL
- CS 224D Video Lectures: Stanford CS 224D is a grad course focused on RNNs and applying deep learning to NLP.
- RNNs and LSTMs: We all love Brandon honestly.
- Recurrent Neural Networks - Intel Nervana: Very comprehensive.
- Understanding LSTM Networks: Chris Olah's posts are readable, yet in-depth.
- Introduction to RNNs: Denny Britz is another great author who has a wide ranging blog.
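The recurrence these posts explain can be sketched in a few lines of NumPy: the same weight matrices are reused at every time step, and a hidden state carries information forward (h_t = tanh(W_x x_t + W_h h_{t-1} + b)). Sizes and names here are illustrative:

```python
# Minimal vanilla-RNN forward pass in NumPy; weights are random, untrained.
import numpy as np

rng = np.random.default_rng(42)
input_size, hidden_size, steps = 3, 4, 5

W_x = rng.normal(scale=0.5, size=(hidden_size, input_size))   # input weights
W_h = rng.normal(scale=0.5, size=(hidden_size, hidden_size))  # recurrent weights
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)                 # initial hidden state
sequence = rng.normal(size=(steps, input_size))

for x_t in sequence:
    # Recurrence: the new state depends on the current input and all past inputs.
    h = np.tanh(W_x @ x_t + W_h @ h + b)

print(h)  # final hidden state summarizing the whole sequence
```

LSTMs (covered in the Chris Olah post above) replace this single tanh update with gated updates so gradients survive over long sequences.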
Reinforcement Learning: While the 3 prior ML methods are necessarily important for understanding RL, a lot of recent progress in this field has combined elements from the deep learning camp as well as from the traditional reinforcement learning field.
- David Silver's Reinforcement Learning Course: Advanced stuff covered here, but David is a fantastic lecturer and I loved the comprehensive content.
- Simple Reinforcement Learning with Tensorflow: Arthur Juliani has a blog post series that covers RL concepts with lots of practical examples.
- Deep Reinforcement Learning Doesn't Work Yet
- Deep RL Arxiv Review Paper
- Pong From Pixels
- Lessons Learned Reproducing a Deep RL Paper
- Book: Sutton & Barto - Reinforcement Learning: An Introduction: https://www.amazon.de/Reinforcement-Learning-Introduction-Adaptive-Computation/dp/0262039249/ref=sr_1_1?ie=UTF8&qid=1549093669&sr=8-1&keywords=richard+reinforcement+learning
- https://www.technologyreview.com/the-download/612438/an-old-fashioned-ai-has-won-a-starcraft-shootout/
Artificial Intelligence:
- Artificial Intelligence Podcast - https://lexfridman.com/ai/
Pretrained Models
- Course: Coursera - Hinton's Neural Networks for Machine Learning: Approx. 5 weeks to complete - perceptrons, backpropagation, word vectors, object recognition, neural nets, optimization, recurrent neural networks, combining multiple neural networks, Hopfield nets, Boltzmann machines, Restricted Boltzmann Machines (RBMs), Deep Belief Nets, generative pre-training, modeling hierarchical structure
- Kaggle: data-science competitions
- Harvard/Edx - Fundamentals of Neuroscience, Part 1: The Electrical Properties of the Neuron: Approx. 5 weeks to complete - Fundamentals of bioelectricity, resting potential, passive membranes, action potentials, nervous system
- Harvard/Edx - Fundamentals of Neuroscience, Part 2: Neurons and Networks: Approx. 6 weeks to complete - synapses, neuronal communication, interconnected neurons in neuronal circuits, neuromodulation in the firing of synapses
- Book: Bishop - Pattern Recognition and Machine Learning: probability theory, decision theory, information theory, probability distributions, binary/multinomial variables, Gaussian distribution, exponential family, nonparametric methods, linear models for regression, Bayesian linear regression, evidence approximation, linear models for classification, discriminant functions, probabilistic generative models, Laplace approximation, kernel methods, sparse kernel machines
- Course: Coursera - Probabilistic Graphical Models: https://www.coursera.org/learn/probabilistic-graphical-models
Others:
- Book: Aurélien Géron (March 2019)
- Book: Santanu Pattanayak - Pro Deep Learning with TensorFlow: A Mathematical Approach to Advanced Artificial Intelligence in Python
- Book: GANs in Action: Deep Learning with Generative Adversarial Networks (2019)
Multi-layer graphical models / deep generative models:
- Deep belief networks
- Restricted Boltzmann machines
- GANs
- Graph networks
- Gaussian process models for hyperparameters (hyperparameter search)
- Genetic algorithms, evolution strategies, reinforcement learning
- Depthwise separable convolutions
- Neural Turing Machines (DeepMind)
Encodings: one-hot encoding, k-hot encoding (mathematical idea)
Losses: cross-entropy, mean squared error, absolute error (mathematical idea)
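The encodings and losses named above are simple enough to write out directly; this NumPy sketch uses illustrative function names:

```python
# One-hot / k-hot encodings and the three losses mentioned above, in NumPy.
import numpy as np

def one_hot(index, num_classes):
    vec = np.zeros(num_classes)
    vec[index] = 1.0                  # exactly one class active
    return vec

def k_hot(indices, num_classes):
    vec = np.zeros(num_classes)
    vec[list(indices)] = 1.0          # several classes active at once
    return vec

def cross_entropy(target, predicted, eps=1e-12):
    # target: one-hot (or probability) vector; predicted: probability vector.
    return -np.sum(target * np.log(predicted + eps))

def mse(target, predicted):
    return np.mean((target - predicted) ** 2)

def mae(target, predicted):
    return np.mean(np.abs(target - predicted))

y = one_hot(2, 4)                     # class 2 of 4 -> [0, 0, 1, 0]
p = np.array([0.1, 0.1, 0.7, 0.1])    # a model's predicted distribution
print(cross_entropy(y, p))            # -log(0.7), low because the prediction fits
```

Cross-entropy is the usual choice for classification (it compares probability distributions), while MSE and absolute error are the usual choices for regression.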
- Delete courses or add them to other sections
- Stanford CS 224D - Deep Learning for NLP
- Stanford CS 229 - Pretty much the same as the Coursera course
Personally, I would always prefer one book over 50 papers. But often you cannot find a book as up to date as a paper, and then there is no way around them. And if you know what you are looking for, papers are great.
- 2018 analysis: http://flip.it/SNv6ek
- Top papers 2018: https://www.techleer.com/articles/517-a-list-of-top-10-deep-learning-papers-the-2018-edition/
- Review of the last 20 years: https://www.technologyreview.com/s/612768/we-analyzed-16625-papers-to-figure-out-where-ai-is-headed-next/
- Arxiv: a repository of electronic preprints (known as e-prints)
- Arxiv Sanity: Web interface for browsing, search and filtering recent arxiv submissions
- Paperscape: Paperscape is an interactive map that visualises the arXiv
- AlexNet
- GoogLeNet
- VGGNet
- ZFNet
- ResNet
- R-CNN
- Fast R-CNN
- Adversarial Images
- Generative Adversarial Networks
- Spatial Transformer Networks
- DCGAN
- Synthetic Gradients
- Memory Networks
- Mixture of Experts
- Neural Turing Machines
- Alpha Go
- Atari DQN
- Word2Vec
- GloVe
- A3C
- Gradient Descent by Gradient Descent
- Rethinking Generalization
- Densely Connected CNNs
- EBGAN
- Wasserstein GAN
- Style Transfer
- Pixel RNN
- Dynamic Coattention Networks
- Convolutional Seq2Seq Learning
- Seq2Seq
- Dropout
- Batch Norm
- Large Batch Training
- Transfer Learning
- Adam
- Speech Recognition
- Relational Networks
- Influence Functions
- ReLu
- Xavier Initialization
- Saddle Points and Non-convexity of Neural Networks
- Overcoming Catastrophic Forgetting in NNs
- Quasi-Recurrent Neural Networks
- Escaping Saddle Points Efficiently
- Progressive Growing of GANs
- Attention is All You Need
- Dynamic Routing Between Capsules
- Unsupervised Machine Translation with Monolingual Corpora
- Population Based Training of NN's
- Learned Index Structures
- Visualizing Loss Landscapes
- DenseNet
- SqueezeNet
- WaveNet
- Hidden Technical Debt in ML Systems
- MobileNets
- Learning from Imbalanced Data
- Information theory and machine learning (https://export.arxiv.org/abs/1808.07593)
Conferences for machine learning - http://www.guide2research.com/topconf/machine-learning
- Book: The Master Algorithm - Pedro Domingos: Symbolists (rule systems, inverse deduction), Connectionists (backpropagation, deep learning), Bayesians (HMMs, graphical models), Evolutionaries (genetic algorithms, evolutionary programming), Analogizers (kNN, SVM)
- Book: Life 3.0: Being Human in the Age of Artificial Intelligence - Max Tegmark: implications of AI, future AGI
- Yann LeCun: Director of AI Research, Facebook and Founding Director of the NYU Center for Data Science
- Andrew Ng: founder of Coursera, led Google Brain, and former Chief Scientist at Baidu
- Geoffrey Hinton: Professor at the University of Toronto and Research Scientist at Google Brain
- Pieter Abbeel: Professor, UC Berkeley, EECS, BAIR, CHCAI and Research Scientist OpenAI
- Andrej Karpathy: director of artificial intelligence and Autopilot Vision at Tesla
- Neil Lawrence: Professor of Machine Learning at the University of Sheffield
- Moritz Hardt: Assistant Professor in Electrical Engineering and Computer Sciences, University of California, Berkeley
- Yoshua Bengio: Professor, Department of Computer Science and Operations Research, Université de Montréal, and Canada Research Chair
- Jürgen Schmidhuber: co-director of the Dalle Molle Institute for Artificial Intelligence Research in Manno
- Ian Goodfellow: Research scientist at Google Brain
- Ilya Sutskever: Chief scientist of OpenAI, coinventor of AlexNet
- Wojciech Zaremba: Head of Robotics research at OpenAI
- Fei-Fei Li: Professor at the Computer Science Department at Stanford University
- Demis Hassabis: Co-founder of renowned artificial intelligence (AI) lab DeepMind
- Vladimir Vapnik: Co-inventor of the support vector machine method, and support vector clustering algorithm
- Michael I. Jordan: Professor at the University of California, Berkeley
- Christopher M. Bishop: Laboratory Director at Microsoft Research Cambridge
- Zoubin Ghahramani: Professor at Cambridge, the Alan Turing Institute and Chief Scientist of Uber
- Ruslan Salakhutdinov: Director of AI Research at Apple and Professor of Computer Science at Carnegie Mellon
- Yuanqing Lin: Former Head of Baidu Research, now at AI startup Aibee.ai
- Jeff Dean: Lead of Google.ai
- Pete Warden: Lead of the TensorFlow Mobile/Embedded team
- Sebastian Ruder: PhD Student in Natural Language Processing and Deep Learning
- Richard Socher
- Google Research: Research department from Google
- Google Brain: deep learning artificial intelligence research team at Google
- Deepmind: Solve intelligence, use it to make the world a better place (company from Google)
- Waymo: develop machine learning solutions addressing open problems in autonomous driving
- Facebook AI Research: Research Unit from Facebook
- Qure.ai: AI for Radiology
- Baidu: specializing in Internet-related services and products and artificial intelligence
- Alibaba: specializing in e-commerce, retail, Internet, AI and technology
- Apple Research: Research department from Apple
- Vicarious: using the theorized computational principles of the brain to build software
- Salesforce Research: cloud-based software company
- Tencent: AI lab in Shenzhen with a vision to “Make AI Everywhere"
- Wechat: Chinese multi-purpose messaging, social media and mobile payment app
- QQ: instant messaging software service developed by the Chinese company Shenzhen Tencent
- Uber: Uber AI Labs
- OpenAI: AI research company co-founded by Elon Musk, focused on AI safety
- Amazon: Research Blog from Amazon
- Microsoft: Research department from Microsoft
- Boston Dynamics: American engineering and robotics design company
- Ogma: Building AI using Neuroscience
- MIT: MIT Computer Science & Artificial Intelligence Lab
- Stanford: Stanford Artificial Intelligence Laboratory
- Cambridge: Cambridge Machine Learning Group
- Caltech: Computation & Neural Systems
- Berkeley AI Research: Berkeley Artificial Intelligence Research
- Berkeley ML Research: Machine Learning at Berkeley
- Website: Lab41: Lab41's blog exploring data science, machine learning, and artificial intelligence
- Website: The Gradient: A digital publication about artificial intelligence and the future
- Website: Off the Convex Path: technical blog about machine learning
- Website: Distill: Latest articles about machine learning
- Website: Towards Data Science: Sharing concepts, ideas, and codes
- Website: TOP500: list of the world’s fastest supercomputers
- Podcast: Learning Machines 101: A Gentle Introduction to Artificial Intelligence
- Podcast: Data Skeptic: data science, statistics, machine learning, artificial intelligence, and scientific skepticism
Credits: Big thanks to all contributors to awesome lists (posted in other resources), which enabled me to find some of the courses contained in the list.