
Machine Learning Exercises - ML_Coursera

Exercise 1: Linear Regression

  1. Linear Regression with One Variable
    • "To predict profits for a food truck"
    • Plotting Data
    • Gradient Descent (see the sketch after this list)
    • Visualizing J
  2. Linear Regression with Multiple Variables
    • "To predict the prices of houses"
    • Feature Normalization
    • Gradient Descent
    • Normal Equations
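
The gradient descent parts of this exercise come down to one vectorized update. Below is a minimal Octave/MATLAB sketch under the usual conventions of the assignment (X already contains a leading column of ones, theta is a column vector); the name gradientDescentSketch is illustrative and is not one of the assignment files.

```matlab
% Minimal sketch of batch gradient descent for linear regression.
function [theta, J_history] = gradientDescentSketch(X, y, theta, alpha, num_iters)
  m = length(y);                       % number of training examples
  J_history = zeros(num_iters, 1);
  for iter = 1:num_iters
    h = X * theta;                                     % hypothesis for all examples
    theta = theta - (alpha / m) * (X' * (h - y));      % simultaneous parameter update
    J_history(iter) = (1 / (2 * m)) * sum((X * theta - y) .^ 2);   % cost J(theta)
  end
end
```

Returning J_history makes it easy to plot the cost per iteration and confirm that gradient descent is converging.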

Exercise 2: Logistic Regression

  1. Logistic Regression with Two Variables
    • "To predict whether a student gets admitted into a university"
    • Visualizing the data
    • Compute Cost and Gradient (see the sketch after this list)
    • Optimizing using fminunc
    • Predict and Accuracies
  2. Regularized Logistic Regression
    • "To predict whether microchips from a fabrication plant pass QA"
    • Visualizing the data
    • Feature mapping
    • Regularization and Accuracies
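
The cost and gradient are written as a single function so that fminunc can minimize it. Here is a minimal unregularized sketch; logisticCostSketch is an illustrative name, and the options shown mirror how the exercise typically invokes fminunc.

```matlab
% Minimal sketch of the (unregularized) logistic regression cost and gradient.
function [J, grad] = logisticCostSketch(theta, X, y)
  m = length(y);
  h = 1 ./ (1 + exp(-X * theta));                         % sigmoid hypothesis
  J = (1 / m) * sum(-y .* log(h) - (1 - y) .* log(1 - h));
  grad = (1 / m) * (X' * (h - y));                        % gradient of J w.r.t. theta
end
```

```matlab
% Optimizing with fminunc (GradObj tells it the function also returns the gradient).
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t) logisticCostSketch(t, X, y), initial_theta, options);
```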

Exercise 3: Multi-class Classification & Neural Networks

  1. Multi-class Classification
    • "To recognize handwritten digits (from 0 to 9)"
    • Loading and Visualizing Data
    • Vectorize Logistic Regression
    • One-Vs-All
  2. Neural Networks
    • "To recognize handwritten digits"
    • Forward Propagation (see the sketch after this list)
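
For the neural network part, prediction is one pass of forward propagation through a 3-layer network. A minimal sketch, assuming Theta1 and Theta2 are the pre-trained weight matrices supplied with the exercise data (predictSketch is an illustrative name):

```matlab
% Minimal sketch of forward propagation for a 3-layer network.
function p = predictSketch(Theta1, Theta2, X)
  m = size(X, 1);
  a1 = [ones(m, 1) X];                       % input layer with bias unit
  a2 = 1 ./ (1 + exp(-a1 * Theta1'));        % hidden layer activations (sigmoid)
  a2 = [ones(m, 1) a2];                      % add bias unit to the hidden layer
  a3 = 1 ./ (1 + exp(-a2 * Theta2'));        % output layer, one unit per digit class
  [~, p] = max(a3, [], 2);                   % predicted class = index of largest output
end
```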

Exercise 4: Neural Network Learning

  1. Neural Network Learning
    • "To apply backpropagation to the task of hand-written digit recognition"
    • Loading and Visualizing Data
    • Compute Cost (Feedforward) *
    • Sigmoid Gradient
    • Implement Backpropagation (see the sketch after this list) *
    • Implement Regularization
    • Training NN
    • Visualize Weights *
    • Implement Predict
    • Gradient Checking
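
The core of this exercise is backpropagation. Here is a minimal sketch for a 3-layer network that accumulates the unregularized gradients over m examples; regularization and gradient checking are left out, Y is assumed to be one-hot encoded, and backpropSketch is an illustrative name.

```matlab
% Minimal sketch of backpropagation (unregularized) for a 3-layer network.
function [Theta1_grad, Theta2_grad] = backpropSketch(Theta1, Theta2, X, Y)
  m = size(X, 1);
  sigmoid = @(z) 1 ./ (1 + exp(-z));
  Theta1_grad = zeros(size(Theta1));
  Theta2_grad = zeros(size(Theta2));
  for t = 1:m
    a1 = [1; X(t, :)'];                        % input activations with bias
    z2 = Theta1 * a1;  a2 = [1; sigmoid(z2)];  % hidden layer
    z3 = Theta2 * a2;  a3 = sigmoid(z3);       % output layer
    d3 = a3 - Y(t, :)';                        % output-layer error
    d2 = (Theta2(:, 2:end)' * d3) ...
         .* (sigmoid(z2) .* (1 - sigmoid(z2)));  % hidden error via sigmoid gradient
    Theta2_grad = Theta2_grad + d3 * a2';
    Theta1_grad = Theta1_grad + d2 * a1';
  end
  Theta1_grad = Theta1_grad / m;               % average over all examples
  Theta2_grad = Theta2_grad / m;
end
```

Gradient checking then compares these analytic gradients against numerical finite-difference estimates before training.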

Exercise 5: Regularized Linear Regression & Bias/Variance

  1. Regularized Linear Regression & Bias/Variance (Polynomial Regression)
    • "To predict the amount of water flowing out of a dam using the changeof water level in a reservoir"
    • "Go through some diagnostics of debugging learning algorithms and examine the effects of bias v.s. variance."
    • Loading and Visualizing Data
    • Train Linear Regression (see the sketch after this list)
    • Learning Curve for Linear Regression
    • Feature Mapping for Polynomial Regression (Normalize)
    • Learning Curve for Polynomial Regression (Different Lambda)
    • Validation for Selecting Lambda
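
All of the learning-curve and lambda-selection steps reuse one regularized cost/gradient routine. A minimal sketch, with the bias term theta(1) left unregularized as in the exercise (linearRegCostSketch is an illustrative name):

```matlab
% Minimal sketch of the regularized linear regression cost and gradient.
function [J, grad] = linearRegCostSketch(theta, X, y, lambda)
  m = length(y);
  h = X * theta;
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);     % do not regularize the bias term
  J = (1 / (2 * m)) * sum((h - y) .^ 2) + reg;
  grad = (1 / m) * (X' * (h - y));
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```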

Exercise 6: Support Vector Machines (SVM)

  1. Support Vector Machines (SVMs)
    • "using support vector machines(SVMs) with various example 2D datasets"
    • try different values of C on this dataset
    • Gaussian Kernel
    • determined the best C and $ parameters to use
  2. Spam Classification with SVMs
    • "using SVMs to build your own spam filter"
    • Preprocessing Emails
    • Vocabulary List
    • Training SVM for Spam Classification
    • Test Spam Classification
    • Top Predictors of Spam
    • Try Your Own Emails
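
The non-linear datasets use a Gaussian (RBF) kernel as the similarity function. A minimal sketch (gaussianKernelSketch is an illustrative name):

```matlab
% Minimal sketch of the Gaussian (RBF) kernel between two feature vectors.
function sim = gaussianKernelSketch(x1, x2, sigma)
  sim = exp(-sum((x1(:) - x2(:)) .^ 2) / (2 * sigma ^ 2));   % exp(-||x1 - x2||^2 / 2*sigma^2)
end
```

Selecting C and σ is then a grid search over candidate values (the exercise suggests multiplicative steps such as 0.01, 0.03, 0.1, ...), keeping the pair with the lowest error on the cross-validation set.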

Exercise 7: K-means Clustering & Principal Component Analysis

  1. K-means Clustering
    • "implement the K-means algorithm & use it for image compression"
    • Find Closest Centroids
    • Compute Means
    • K-Means Clustering (see the sketch after this list)
    • Random initialization
    • K-Means Clustering on Pixels
    • Use your own image (different K)
  2. Principal Component Analysis (PCA)
    • "use PCA to perform dimensionality reduction(Face Image Dataset)"
    • implement PCA
    • Dimension Reduction:Projecting the data onto the principal components
    • Reconstructing an approximation of the data
    • Dimension Reduction for Faces
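
"Find Closest Centroids" and "Compute Means" together make up one K-means iteration. A minimal sketch combining the two steps (kmeansStepSketch is an illustrative name; empty clusters are not handled here):

```matlab
% Minimal sketch of one K-means iteration: assignment step, then update step.
function [centroids, idx] = kmeansStepSketch(X, centroids)
  K = size(centroids, 1);
  m = size(X, 1);
  idx = zeros(m, 1);
  for i = 1:m
    d = sum(bsxfun(@minus, centroids, X(i, :)) .^ 2, 2);  % squared distance to each centroid
    [~, idx(i)] = min(d);                                 % assign example i to its closest centroid
  end
  for k = 1:K
    centroids(k, :) = mean(X(idx == k, :), 1);            % move centroid k to the mean of its points
  end
end
```

For image compression, each pixel is an RGB example and the K centroids become the compressed color palette.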

Exercise 8: Anomaly Detection and Recommender Systems

  1. Anomaly detection
    • "detect anomalous behavior in server computers"
    • Estimating parameters for a Gaussian
    • Selecting the threshold (F1 Score)
    • High dimensional dataset
  2. Recommender Systems (collaborative filtering learning algorithm)
    • "implement the collaborative fi ltering learning algorithm and apply it to a dataset of movie ratings"
    • Collaborative ltering cost function
    • Collaborative ltering gradient
    • Regularized
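
The collaborative filtering cost and gradients can be computed in a few vectorized lines. A minimal sketch, where R(i,j) = 1 iff user j rated movie i, X holds the movie feature vectors and Theta the user parameter vectors (cofiCostSketch is an illustrative name):

```matlab
% Minimal sketch of the regularized collaborative filtering cost and gradients.
function [J, X_grad, Theta_grad] = cofiCostSketch(X, Theta, Y, R, lambda)
  E = (X * Theta' - Y) .* R;                       % errors only where a rating exists
  J = (1 / 2) * sum(E(:) .^ 2) ...
      + (lambda / 2) * (sum(X(:) .^ 2) + sum(Theta(:) .^ 2));
  X_grad = E * Theta + lambda * X;                 % gradient w.r.t. movie features
  Theta_grad = E' * X + lambda * Theta;            % gradient w.r.t. user parameters
end
```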
