The goal of this repository is to summarize AI technologies and implementations.
The documents for the AI articles are mainly written in Markdown. To make the examples easy to run, we use Docker to set up PyTorch, Jupyter Notebook, and the other library environments; please refer to INSTALL for the environment preparation.
- Linear Algebra Review and Reference(1)
- Linear Algebra Review and Reference(2)
- Entropy_Cross_Entropy_KL_Divergence (see the sketch after this list)
- Taylor_expansion_Multi_Variables_Functions_extremum
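
The entropy article above works with a few closed-form quantities; here is a minimal sketch of them in NumPy (the function names and example distributions are ours for illustration, not the article's):

```python
# Minimal sketch of entropy, cross-entropy, and KL divergence for discrete
# distributions given as arrays that sum to 1.
import numpy as np

def entropy(p):
    """H(p) = -sum_i p_i * log(p_i), treating 0 * log(0) as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(q[nz]))

def kl_divergence(p, q):
    """KL(p || q) = H(p, q) - H(p); always >= 0, zero iff p == q."""
    return cross_entropy(p, q) - entropy(p)

p = np.array([0.5, 0.25, 0.25])
q = np.array([0.4, 0.4, 0.2])
print(entropy(p), cross_entropy(p, q), kl_divergence(p, q))
```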
Topic | Key points | Code or comments
---|---|---
Linear_Regression | 1. The errors between labels and predictions follow a normal distribution. 2. The joint likelihood of the samples is therefore a product of normal densities, so maximizing it reduces to least squares. | code (see the sketch below the table)
Logistic_Regression | 1. The sample labels follow a Bernoulli distribution. 2. The logistic function and the property of its derivative. 3. Maximizing the log likelihood is equivalent to maximizing the likelihood. | code (see the sketch below the table)
Newton's Method | 1. Properties of the Hessian matrix of second derivatives. 2. Multivariable Taylor expansion. | code (see the sketch below the table)
Generalized Linear Models | 1. Exponential family distributions. 2. Constructing a GLM from an exponential family distribution. | softmax regression code
Generative Learning algorithms | 1. How they differ from discriminative learning algorithms. |
Gaussian discriminant analysis | 1. The multivariate normal distribution. 2. GDA makes stronger modeling assumptions than logistic regression. |
Naive Bayes | 1. Features are discrete-valued. 2. Features are conditionally independent given y. |
Kernel Methods | 1. Feature mapping. |
SVM | 1. Functional and geometric margins. 2. The optimal margin classifier. 3. Lagrange duality. |
Learning Theory | |
Adaboost | 1. Weak models. 2. Additive models. 3. Exponential loss function. | Algorithm principles and derivation
Decision Tree | 1. Information entropy. 2. Information gain. |
Random Forest | 1. Bagging method. 2. Out-of-bag error. |
Tree Boosting | 1. Additive models. 2. Forward stagewise method. | CART regression tree, GBDT, XGBoost, LightGBM
PGM | A plain-language explanation of HMM; MRF |
Neural Networks | 1. Multilayer perceptron. 2. Back propagation. | Back Propagation
The k-means clustering algorithm | | see the sketch below the table
Mixtures of Gaussians and the EM algorithm | 1. Jensen's inequality. 2. Latent random variables. | GMM; The EM algorithm (see the sketch below the table)
Factor analysis | |
Principal components analysis | Covariance | PCA (see the sketch below the table)
Independent Components Analysis | |
Reinforcement Learning and Control | MDP |
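
To make the Linear_Regression row concrete, here is a minimal sketch (illustrative data and variable names, not the repository's notebook code) of why Gaussian errors turn maximum likelihood into least squares:

```python
# Linear regression sketch: with Gaussian errors y = X @ w + eps, maximizing
# the log likelihood reduces to minimizing ||y - X @ w||^2, whose closed-form
# solution is the normal equations below.
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])  # bias + 1 feature
true_w = np.array([1.0, 2.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)           # normal errors

# Normal equations: w = (X^T X)^{-1} X^T y, the least-squares / ML estimate
w_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(w_hat)  # close to [1.0, 2.0]
```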
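For the Logistic_Regression row, a sketch of maximizing the Bernoulli log likelihood by gradient ascent; the simple gradient form follows from the sigmoid derivative property sigmoid'(z) = sigmoid(z)(1 - sigmoid(z)). Data and names are illustrative:

```python
# Logistic regression sketch: p(y=1|x) = sigmoid(x @ w) (Bernoulli labels);
# the gradient of the log likelihood is X^T (y - sigmoid(X @ w)).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
true_w = np.array([0.5, 2.0, -1.0])
y = (rng.random(200) < sigmoid(X @ true_w)).astype(float)

w = np.zeros(3)
lr = 0.1
for _ in range(1000):
    w += lr * X.T @ (y - sigmoid(X @ w)) / len(y)  # gradient ascent step
print(w)  # roughly recovers true_w
```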
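For the Newton's Method row, a sketch that ties the multivariable Taylor expansion to the update rule: approximate f to second order, f(x + d) ≈ f(x) + gᵀd + ½ dᵀHd, and minimize the quadratic to get the step d = -H⁻¹g. The test function is made up for illustration:

```python
# Newton's method sketch: repeatedly jump to the stationary point of the
# local second-order Taylor approximation. Near a minimum (H positive
# definite) the iteration converges quadratically.
import numpy as np

def grad(x):     # gradient of f(x0, x1) = (x0 - 1)^2 + 10 * (x1 + 2)^2
    return np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

def hessian(x):  # constant Hessian, since f is quadratic
    return np.array([[2.0, 0.0], [0.0, 20.0]])

x = np.array([5.0, 5.0])
for _ in range(10):
    x = x - np.linalg.solve(hessian(x), grad(x))  # Newton step d = -H^{-1} g
print(x)  # the minimizer [1, -2]; exact in one step for a quadratic f
```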
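For the k-means row, a minimal sketch of the two alternating steps, on synthetic data (not the repository's notebook code):

```python
# k-means sketch: alternate assigning points to the nearest centroid and
# recomputing each centroid as the mean of its assigned points.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assignment step: index of the nearest centroid for every point
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # update step: each centroid becomes the mean of its cluster
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centroids, labels = kmeans(X, k=2)
print(centroids)  # near [0, 0] and [3, 3]
```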
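For the Mixtures of Gaussians row, a compact 1-D EM sketch with the component indicator as the latent variable; the E and M steps below follow the standard derivation via Jensen's inequality. All data and symbols are illustrative:

```python
# EM for a two-component 1-D Gaussian mixture. The E-step computes posterior
# responsibilities of the latent component indicator; the M-step re-estimates
# the parameters from those responsibilities.
import numpy as np

def gauss_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

w = np.array([0.5, 0.5])    # mixing weights
mu = np.array([-1.0, 1.0])  # component means (rough initial guesses)
var = np.array([1.0, 1.0])  # component variances

for _ in range(100):
    # E-step: responsibility resp[k, i] = p(component k | x_i)
    dens = np.stack([w[k] * gauss_pdf(x, mu[k], var[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: weighted re-estimates of weights, means, and variances
    nk = resp.sum(axis=1)
    w = nk / len(x)
    mu = (resp * x).sum(axis=1) / nk
    var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk

print(w, mu, var)  # should approach [0.3, 0.7], [-2, 3], [1, 1]
```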
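For the Principal components analysis row, a sketch of PCA computed straight from the covariance matrix (synthetic data for illustration):

```python
# PCA sketch: center the data, form the sample covariance matrix, and take
# its leading eigenvectors as the principal directions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])  # correlated

Xc = X - X.mean(axis=0)                  # center each feature
cov = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # sort descending by explained variance
components = eigvecs[:, order]           # columns are the principal directions
Z = Xc @ components[:, :1]               # projection onto the first component
print(eigvals[order])
print(components)
```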
- References
- CNN
- RNN
- LSTM
- GAN
- VAE (Zhihu topic)
- (Highly recommended) PyTorch introduction and practice
- PyTorchDocs
- Stanford course
- UFLDL Tutorial (http://ufldl.stanford.edu/tutorial/)
- Andrew Ng's machine learning notes
- Microsoft AI course
- Columbia University: Applied Machine Learning
- CS231n: Convolutional Neural Networks for Visual Recognition
- How to set up a local environment for Python notebooks?
- Please refer to notebook setup to set up the environment.
Topic (ML) |
---|
Differences and connections between SVM and LR |
Evaluation metrics for classifier models |
Pros and cons of sigmoid and ReLU |
Why the Gaussian kernel corresponds to an infinite-dimensional feature space (see the sketch below) |
The complex (exponential) form of Fourier analysis |
How to understand the method of Lagrange multipliers |
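
For the Gaussian-kernel question above, a short derivation sketch: expanding the cross term of the RBF kernel as a power series exhibits the kernel as an inner product of infinite-dimensional feature maps.

```latex
% The RBF kernel factors through an infinite power series:
K(x, z) = \exp\!\Big(-\frac{\|x - z\|^2}{2\sigma^2}\Big)
        = \exp\!\Big(-\frac{\|x\|^2}{2\sigma^2}\Big)
          \exp\!\Big(-\frac{\|z\|^2}{2\sigma^2}\Big)
          \exp\!\Big(\frac{x^\top z}{\sigma^2}\Big)
        = \exp\!\Big(-\frac{\|x\|^2 + \|z\|^2}{2\sigma^2}\Big)
          \sum_{n=0}^{\infty} \frac{(x^\top z)^n}{\sigma^{2n}\, n!}
```

Each term $(x^\top z)^n$ is itself the inner product of degree-$n$ monomial features, so the implied feature map stacks monomials of every degree; that is the sense in which the Gaussian kernel has infinitely many dimensions.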
- Prepare the CS229 articles and samples
- Introduce the most popular applications
- Prepare Kaggle samples