EPsy 8264: Advanced Multiple Regression Analysis

This GitHub repository is the home for the EPsy 8264: Advanced Multiple Regression Analysis course materials. You can access the course website at: https://github.com/zief0002/epsy-8264



Downloading the Course Materials

To download all of the materials simultaneously from this site, click on the Clone or Download button and select Download ZIP. This will download a ZIP file of the entire site to your local computer.

To download individual PDF files, open the file link and then click on the Download button. CSV files can be downloaded individually by opening their links and clicking on the Raw button. This should display the text of the CSV file in your browser window. If you right-click on this text, you should be able to Save as or Save page as (or something along those lines).



Instructor

Andrew Zieffler (zief0002@umn.edu)
Office: Education Sciences Building 178
Hours: Wednesdays 9:00am–10:00am; and by appointment


Teaching Assistant

Jonathan Brown (brow3019@umn.edu)
Office: Education Sciences Building 192 (or the table immediately outside Room 192)
Hours: Mondays and Thursdays 12:00pm–1:00pm; and by appointment


Course Content and Resources

Below are the required readings for the course. Some are traditional readings (e.g., books, papers); others are online videos or webpages to "read". For each topic, additional resources (and courses) are also provided for students interested in pursuing the topic further outside of this course.


The required textbook is:

  • Fox, J. (2015). Applied regression analysis and generalized linear models (3rd edition). Los Angeles: Sage.


Some R Resources

Here are some R resources:



Unit 01: Some Mathematics Relevant to Regression

In this unit we will explore common formulas and rules that will be used to prove and mathematically illustrate ideas in regression. In addition to the materials in the notes folder and what we cover in class, there are many resources online. Google search terms include:

  • "Summation rules"
  • "Expectation rules"
  • "Covariance rules"
  • "Variance rules"
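For reference, a few of the standard identities this unit covers can be written compactly (these are general rules, not content specific to the course notes):

```latex
\begin{aligned}
\sum_{i=1}^n (x_i - \bar{x}) &= 0 \\
E(aX + b) &= a\,E(X) + b \\
\mathrm{Var}(aX + b) &= a^2\,\mathrm{Var}(X) \\
\mathrm{Cov}(aX, bY) &= ab\,\mathrm{Cov}(X, Y) \\
\mathrm{Var}(X + Y) &= \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)
\end{aligned}
```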


Unit 02: Some Theory Underlying Simple Linear Regression

In this unit we will derive and prove several properties of the OLS simple regression model. We will also examine how violations of the underlying assumptions of the regression model affect these properties. In addition to the materials in the notes folder and what we cover in class, there are many resources online. Google search terms include:

  • "Properties of OLS estimators"
  • "Simple linear regression"
  • "Regression theory"
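As a quick numerical illustration of the kinds of properties this unit proves, the sketch below (using simulated data, not course data) verifies the closed-form OLS estimates and two algebraic facts about the residuals:

```r
# Simulated data (assumed example, not from the course notes)
set.seed(8264)
x = rnorm(100, mean = 50, sd = 10)
y = 3 + 0.5 * x + rnorm(100, sd = 4)

# Closed-form OLS slope and intercept
b1 = sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 = mean(y) - b1 * mean(x)

# These match the estimates from lm()
fit = lm(y ~ x)
all.equal(unname(coef(fit)), c(b0, b1))

# The residuals sum to zero and are orthogonal to x
e = y - (b0 + b1 * x)
sum(e)
sum(e * x)
```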


Unit 03: A Bit of Linear Algebra

In this unit we will learn some of the basic linear/matrix algebra that is useful for regression analyses. The notes for this unit are available as HTML slides at:

In addition to the materials in the notes folder and what we cover in class, there are many other resources for learning linear/matrix algebra. Here are some resources that may be helpful in that endeavor:

  • 3Blue1Brown. (2016). Essence of linear algebra. (A series of YouTube videos that teaches linear algebra)
  • Burrill, G., Burrill, J., Landwehr, J., & Witmer, J. (1998). Advanced modeling and matrices. Orangeburg, NY: Dale Seymour Publications. (This is a module from Data-Driven Mathematics written for secondary school students and teachers)
  • Fox, J. (2009). A mathematical primer for social statistics. Thousand Oaks, CA: Sage. (Available online via UMN Library)
  • Irizarry, R., & Love, M. (2018). Biomedical data science PH525x series. (Chapter 4 addresses matrix algebra)
  • Namboodiri, N. K. (1984). Matrix algebra: An introduction. Thousand Oaks, CA: Sage.

For more advanced matrix applications:

  • Brownlee, J. (2018). A gentle introduction to matrix factorization for machine learning. [Blog Post]. (Gives an introduction to matrix decomposition methods (e.g., LU, QR); these methods reduce a matrix into constituent parts that make it easier to calculate more complex matrix operations such as finding an inverse.)
  • Thomas, R., & Howard, J. (2017). Computational Linear Algebra. fast.ai. (An online course using Jupyter Notebooks and Python focused on the question: How do we do matrix computations with acceptable speed and acceptable accuracy?)
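One payoff of the matrix algebra in this unit is that the OLS estimates can be computed directly from the normal equations, b = (X'X)⁻¹X'y. A minimal sketch with simulated data (an assumed example):

```r
# OLS via the normal equations: solve (X'X) b = X'y
set.seed(8264)
n = 50
x1 = rnorm(n)
x2 = rnorm(n)
y = 1 + 2 * x1 - 3 * x2 + rnorm(n)

X = cbind(1, x1, x2)                        # design matrix with intercept column
b = solve(crossprod(X), crossprod(X, y))    # crossprod(X) computes X'X

# The result matches lm()
all.equal(unname(drop(b)), unname(coef(lm(y ~ x1 + x2))))
```

Solving the linear system with `solve(A, b)` is numerically preferable to explicitly inverting X'X.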


Unit 08: Collinearity Diagnostics

In this unit we will learn about some empirical diagnostics useful for detecting collinearity. The notes for this unit are available as an HTML file at:
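One common diagnostic covered in treatments of collinearity is the variance inflation factor, VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing the jth predictor on the others. A sketch computing it from scratch on simulated collinear data (an assumed example):

```r
# Simulate three predictors; x2 is nearly collinear with x1 (assumption)
set.seed(8264)
n = 100
x1 = rnorm(n)
x2 = x1 + rnorm(n, sd = 0.1)
x3 = rnorm(n)

# VIF for the jth column of X: 1 / (1 - R^2 from regressing it on the rest)
vif = function(j, X) {
  r2 = summary(lm(X[, j] ~ X[, -j]))$r.squared
  1 / (1 - r2)
}

X = cbind(x1, x2, x3)
sapply(1:3, vif, X = X)   # x1 and x2 have large VIFs; x3 is near 1
```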



Unit 09: Biased Estimation and Shrinkage

In this unit we will learn about ridge regression, a form of biased estimation, and how we can use it to alleviate collinearity. The notes for this unit are available as an HTML file at:

In addition to the class notes and what we cover in class, there are many other resources for learning about shrinkage methods. Here are some resources that may be helpful in that endeavor:
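The core computation in ridge regression is a small modification of the normal equations: b_ridge = (X'X + λI)⁻¹X'y, fit to standardized predictors. A sketch with simulated collinear data and an arbitrarily chosen λ (both assumptions for illustration):

```r
# Simulated collinear predictors (assumption for the sketch)
set.seed(8264)
n = 100
x1 = rnorm(n)
x2 = x1 + rnorm(n, sd = 0.1)
y = x1 + x2 + rnorm(n)

X = scale(cbind(x1, x2))   # standardize predictors
z = y - mean(y)            # center the outcome
lambda = 0.1               # arbitrary penalty for illustration

b_ridge = solve(crossprod(X) + lambda * diag(ncol(X)), crossprod(X, z))
b_ols   = solve(crossprod(X), crossprod(X, z))

# Ridge shrinks the coefficient vector relative to OLS
sum(b_ridge^2) < sum(b_ols^2)
```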



Unit 10: Heteroskedasticity and WLS Estimation

In this unit we will learn about heteroskedasticity, and three methods to deal with this violation of the linear model's assumptions: (1) variance stabilizing transformations; (2) Weighted Least Squares (WLS) estimation; and (3) adjusting the SEs via sandwich estimation. The notes for this unit are available as an HTML file at:
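Of the three remedies listed, WLS is the most direct to sketch in R: observations are weighted inversely to their error variance. The example below assumes Var(e_i) is proportional to x (a simulated setup, not from the notes):

```r
# Simulate heteroskedastic errors whose variance grows with x (assumption)
set.seed(8264)
x = runif(100, 1, 10)
y = 2 + 3 * x + rnorm(100, sd = sqrt(x))

ols = lm(y ~ x)
wls = lm(y ~ x, weights = 1 / x)   # weights proportional to 1 / Var(e_i)

coef(ols)
coef(wls)
```

The point estimates are similar here, but the WLS standard errors are valid under the assumed variance structure while the naive OLS standard errors are not.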



Unit 11: Polynomial Regression

In this unit we will learn about one method for dealing with nonlinearity: polynomial regression. The notes for this unit are available as an HTML file at:
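A minimal sketch of fitting a quadratic polynomial model in R, using simulated nonlinear data (an assumed example). `poly()` uses orthogonal polynomial contrasts, while raw powers via `I()` give the same fitted values:

```r
# Simulated data with a quadratic trend (assumption for the sketch)
set.seed(8264)
x = runif(100, -2, 2)
y = 1 + x - 2 * x^2 + rnorm(100, sd = 0.5)

fit_orth = lm(y ~ poly(x, 2))   # orthogonal polynomial basis
fit_raw  = lm(y ~ x + I(x^2))   # raw polynomial terms

# Different coefficients, identical fitted values
all.equal(fitted(fit_orth), fitted(fit_raw))
```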



Unit 12: Cross-Validation

In this unit we will learn about cross-validation as a method for model selection. The notes for this unit are available as an HTML file at:

In addition to the class notes and what we cover in class, there are many other resources for learning about cross-validation. Here are some resources that may be helpful in that endeavor:
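The basic k-fold recipe can be sketched in a few lines: split the data into k folds, fit the candidate model on k − 1 folds, and compute the prediction error on the held-out fold. A hand-rolled 5-fold example on simulated data (all assumptions for illustration):

```r
# Simulated data and a candidate quadratic model (assumptions)
set.seed(8264)
n = 100
x = runif(n, -2, 2)
y = 1 + x - 2 * x^2 + rnorm(n)
d = data.frame(x, y)

k = 5
fold = sample(rep(1:k, length.out = n))   # random fold assignment

mse = sapply(1:k, function(i) {
  fit  = lm(y ~ poly(x, 2), data = d[fold != i, ])    # train on k-1 folds
  pred = predict(fit, newdata = d[fold == i, ])       # predict held-out fold
  mean((d$y[fold == i] - pred)^2)
})

mean(mse)   # cross-validated estimate of prediction error
```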



Unit 13: Piecewise Regression Models

In this unit we will learn about piecewise models as a method for fitting local models. The notes for this unit are available as an HTML file at:

In addition to the class notes and what we cover in class, there are many other resources for learning about piecewise models. Here are some resources that may be helpful in that endeavor:

  • Berk, R. (2016). Splines, smoothers, and kernels. Statistical learning from a regression perspective (2nd ed., pp. 55–127). New York: Springer.
  • Fox, J. (2016). Nonlinear regression. Applied regression analysis and generalized linear models (3rd ed., pp. 502–527). Thousand Oaks, CA: Sage.
  • James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). Moving beyond linearity. An introduction to statistical learning: with applications in R (pp. 265–301). New York: Springer.
  • Statistical Learning MOOC taught by Hastie and Tibshirani
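One simple piecewise formulation adds a hinge term to the linear model: the coefficient on pmax(x − k, 0) is the change in slope after the knot k. A sketch with a knot placed at x = 5 on simulated data (both assumptions for illustration):

```r
# Simulated data whose slope changes at x = 5 (assumption for the sketch)
set.seed(8264)
x = runif(200, 0, 10)
y = 2 + 1 * x - 1.5 * pmax(x - 5, 0) + rnorm(200, sd = 0.5)

# pmax(x - 5, 0) is the hinge term; its coefficient estimates
# the change in slope to the right of the knot
fit = lm(y ~ x + pmax(x - 5, 0))
coef(fit)
```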


Unit 14: Regression Splines

In this unit we will learn about regression splines as a method for fitting local models. The notes for this unit are available as an HTML file at:

In addition to the class notes and what we cover in class, there are many other resources for learning about regression splines. Here are some resources that may be helpful in that endeavor:

  • Berk, R. (2016). Splines, smoothers, and kernels. Statistical learning from a regression perspective (2nd ed., pp. 55–127). New York: Springer.
  • Fox, J. (2016). Nonlinear regression. Applied regression analysis and generalized linear models (3rd ed., pp. 502–527). Thousand Oaks, CA: Sage.
  • James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). Moving beyond linearity. An introduction to statistical learning: with applications in R (pp. 265–301). New York: Springer.
  • Statistical Learning MOOC taught by Hastie and Tibshirani
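In R, a cubic B-spline basis can be generated with `bs()` from the base splines package and used directly inside `lm()`. A sketch with simulated data and arbitrarily chosen knots (both assumptions for illustration):

```r
# Cubic B-spline regression with knots at x = 3 and x = 6 (assumed knots)
library(splines)
set.seed(8264)
x = runif(200, 0, 10)
y = sin(x) + rnorm(200, sd = 0.3)

# bs() builds the B-spline basis; degree 3 + 2 interior knots
# gives 5 basis columns, plus the intercept
fit = lm(y ~ bs(x, knots = c(3, 6), degree = 3))
length(coef(fit))
```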

