CSE847 Machine Learning, 2016 Spring

Instructor: Jiayu Zhou [email]

Time: Tuesday and Thursday 2:40pm - 4:00pm

Location: Engineering Building, EB 2205

Office Hours: Tuesday and Thursday 4:00pm-5:00pm, EB 2134

Course Description

Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., that learn to spot high-risk medical patients, recognize speech, classify text documents, detect credit card fraud, or drive autonomous robots). This course provides an in-depth understanding of machine learning and statistical pattern recognition techniques and their applications in biomedical informatics, computer vision, and other domains.

Topics: probability distributions, regression, classification, kernel methods, clustering, semi-supervised learning, mixture models, graphical models, dimensionality reduction, manifold learning, sparse learning, multi-task learning, transfer learning, and Hidden Markov Models.

Homework assignments include both theoretical derivations and hands-on experiments with various learning algorithms. Every student is required to complete a project that is either assigned by the instructor or designed by the student.

Course Announcements

Announcements will be emailed to the course mailing list. A welcome note will be sent to the mailing list at the beginning of the semester. If you do not receive the welcome message before the first class, please email me.

Policies

Course Requirements and Grading

The grade will be calculated as follows (a brief illustrative sketch appears after the list):

  • Assignments: 40%
  • Project: 25%
  • Exams (Midterm: Tuesday, March 15; Final): 30%
  • Class participation: 5%
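
As a rough illustration only (not an official formula; component scores are assumed to be on a 0-100 scale), the weighting above corresponds to:

    # Illustrative sketch of the grade weighting; not an official formula.
    WEIGHTS = {"assignments": 0.40, "project": 0.25, "exam": 0.30, "participation": 0.05}

    def course_grade(scores):
        """Weighted sum of component scores, e.g. scores = {"assignments": 92, ...}."""
        return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)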

Homework

Lateness and Extensions

Homework is worth full credit at the beginning of class on the due date (later if an extension has been granted). It is worth at most 90% credit for the next 24 hours, at most 50% credit for the following 24 hours, and 25% credit after that. If you need an extension, please ask for it (by emailing the instructor) as soon as the need for it is known. Extensions that are requested promptly will be granted more liberally. You must turn in all assignments.
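
For concreteness, the schedule above amounts to the following (an illustrative sketch; hours_late is assumed to be measured from the beginning of class on the due date):

    # Illustrative sketch of the lateness schedule; hours_late is assumed to be
    # measured from the beginning of class on the due date.
    def max_credit(hours_late):
        """Maximum fraction of credit available for a late homework."""
        if hours_late <= 0:
            return 1.00   # on time (or within a granted extension): full credit
        elif hours_late <= 24:
            return 0.90   # next 24 hours: at most 90%
        elif hours_late <= 48:
            return 0.50   # following 24 hours: at most 50%
        else:
            return 0.25   # after that: 25%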

Collaboration among Students

The purpose of student collaboration is to facilitate learning, not to circumvent it. Studying the material in groups is strongly encouraged. You may also seek help from other students in understanding the material needed to solve a particular homework problem, provided no written notes are shared or taken at that time, and provided learning is facilitated rather than circumvented. The actual solution must be produced by each student alone, and each student should be ready to reproduce their solution upon request. Any form of help or collaboration must be disclosed in full by all involved on the first page of their assignment. In all cases, you must exercise academic integrity.

Tentative Schedule

Assignments

Please carefully read the Lateness and Extensions section for policies.

  • Assignment 1 Due on Thursday, Jan 28.
  • Assignment 2 Due on Thursday, Feb 16.
  • Assignment 3 Due on Tuesday, March 15.
  • Assignment 4 Due on Tuesday, April 5.
  • Assignment 5 Due on Tuesday, April 19.
  • Assignment 6 Due on Tuesday, May 3 (Bonus, optional).

The schedule is tentative and subject to change.

References

Machine Learning

  • Textbook: Pattern Recognition and Machine Learning, Christopher M. Bishop, 2006. Webpage

  • Reference book: The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Second Edition) by Trevor Hastie, Robert Tibshirani and Jerome Friedman (2009) Book

Linear Algebra and Matrix Computation

  • Gradient computation w.r.t. a vector/matrix

https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf
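
As an example of the kind of identity collected in the Matrix Cookbook, the following sketch (illustrative Python/NumPy, not course-provided code) numerically checks the standard identity d/dx (x'Ax) = (A + A')x with a finite-difference approximation:

    # Sketch: check d/dx (x' A x) = (A + A') x numerically (assumes NumPy).
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    x = rng.standard_normal(4)

    analytic = (A + A.T) @ x          # Matrix Cookbook identity

    eps = 1e-6
    numeric = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        # central finite difference of f(x) = x' A x in coordinate i
        numeric[i] = ((x + e) @ A @ (x + e) - (x - e) @ A @ (x - e)) / (2 * eps)

    print(np.allclose(analytic, numeric, atol=1e-4))  # expect True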

Basic Probability Theory

  • Shorter materials

http://ai.stanford.edu/~paskin/gm-short-course/lec1.pdf

http://www.sci.utah.edu/~gerig/CS6640-F2010/prob-tut.pdf

  • Longer books

http://mplab.ucsd.edu/tutorials/ProbabilityAndStats.pdf

https://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/amsbook.mac.pdf

Basic Optimization

  • Lecture notes from Andrew Ng:

http://cs229.stanford.edu/notes/cs229-notes1.pdf

  • If you are interested in systematically studying optimization, try reading the book Convex Optimization:

https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf

Basic gradient descent is described on page 463 (page 477 of the PDF file).
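
For intuition, here is a minimal gradient-descent sketch (illustrative Python/NumPy, fixed step size, least-squares objective; not the book's exact presentation):

    # Gradient descent on f(w) = (1/(2n)) * ||X w - y||^2 (assumes NumPy).
    import numpy as np

    def gradient_descent(X, y, step=0.1, iters=500):
        n = len(y)
        w = np.zeros(X.shape[1])
        for _ in range(iters):
            grad = X.T @ (X @ w - y) / n   # gradient of f at w
            w = w - step * grad            # step against the gradient
        return w

    # Usage on synthetic data: the result should be close to w_true.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true
    print(gradient_descent(X, y))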

About

Michigan State University CSE847 Spring 2016
