Presentations

This repo contains presentations I've given on machine learning topics over the last two years.

D3M Lab

The majority of my presentations were given as part of my computer science MSc in the Data-Driven Decision Making Lab (D3M).

Diffusion 1

Presentation here. An in-depth presentation of diffusion models, primarily focusing on the DDPM paper. Some connections are made to the continuous SDE interpretation.
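The DDPM forward (noising) process discussed in the talk can be sampled in closed form. A minimal sketch follows; the linear beta schedule and step count are illustrative assumptions, not values taken from the presentation.

```python
import numpy as np

def ddpm_forward(x0, t, betas):
    """Sample x_t ~ q(x_t | x_0) directly, without iterating t steps."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)[t]  # product of alphas up to step t
    noise = np.random.randn(*x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise

betas = np.linspace(1e-4, 0.02, 1000)  # assumed linear schedule
x0 = np.random.randn(8)                # toy "data" sample
xT = ddpm_forward(x0, 999, betas)      # near-isotropic Gaussian at t = T
```

At the final step, alpha_bar is close to zero, so x_T is approximately a standard Gaussian regardless of x_0, which is what makes the reverse process a generative model.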

Diffusion 2

Presentation here. A follow-up presentation, with the first part focused on the continuous SDE interpretation of diffusion models. The second part covers practical applications and generic diffusion tasks such as denoising, inpainting, and super-resolution.

Geometry and Topology in Deep Learning

Presentation here. This presentation is a more general survey of topics. First, some basic concepts related to manifolds are introduced, along with visualization warm-ups and the intuition behind dimension estimation methods. Familiarity with the significance of the manifold hypothesis is assumed. Part 2 deals with constraints which allow us to determine an embedded sub-manifold that the data manifold is certain to be contained in. Part 3 deals with symmetries which further describe and constrain the embedded sub-manifold on which the dataset must live. Notions of invariance and equivariance are briefly discussed. Finally, the presentation concludes with some personal thoughts on the MLE perspective behind modern generative modelling and its relationship to the manifold hypothesis.
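One simple way to build intuition for dimension estimation is a PCA-based estimate: count how many principal components are needed to explain most of the variance. This is a hedged sketch; the 95% threshold and the toy dataset are assumptions for illustration, not methods from the talk.

```python
import numpy as np

def pca_dimension(points, var_threshold=0.95):
    """Estimate intrinsic dimension via explained variance of PCA."""
    centered = points - points.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)  # singular values, descending
    var = s**2 / np.sum(s**2)                      # variance per component
    return int(np.searchsorted(np.cumsum(var), var_threshold) + 1)

# A flat 2-D plane embedded in 5-D ambient space should be detected as 2-D.
rng = np.random.default_rng(0)
coords = rng.standard_normal((500, 2))
embedding, _ = np.linalg.qr(rng.standard_normal((5, 2)))  # orthonormal 5x2
dim = pca_dimension(coords @ embedding.T)  # → 2
```

PCA only detects linear subspaces; curved data manifolds need local or nearest-neighbour variants of this idea.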

Differential Equations and Machine Learning

Presentation here. The presentation begins with an introduction to solving ODEs numerically, covering several types of ODE solvers. Neural ODEs are discussed as a continuous limit of residual networks; the details of the adjoint method are avoided. Flow matching and continuous normalizing flows are discussed in relation to diffusion models. The presentation concludes with a brief discussion of graph neural networks and their continuous generalizations.
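The simplest numerical ODE solver in the family the talk introduces is forward Euler. A minimal sketch, with the test equation and step count chosen purely for illustration:

```python
def euler_solve(f, y0, t0, t1, n_steps):
    """Integrate dy/dt = f(t, y) from t0 to t1 with fixed-step forward Euler."""
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        y = y + h * f(t, y)  # step along the local slope
        t = t + h
    return y

# dy/dt = y has the exact solution y(t) = e^t; Euler approximates e at t = 1.
approx = euler_solve(lambda t, y: y, 1.0, 0.0, 1.0, 10_000)
```

The same loop, with `f` replaced by a neural network, is exactly the residual-network view of a Neural ODE: each Euler step is one residual block.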

Tilted Prior

Presentation here. A rough overview of a paper I worked on. Problems related to the calibration of likelihood-based generative models are covered. A hypothesis related to the behaviour of the Gaussian distribution in high dimensions is introduced, and the tilted prior is proposed as a remedy. Finally, the "will-it-move" test is discussed as an additional method to help with out-of-distribution detection.

State-Space Models

Presentation here. A review of the HiPPO, S4, and Mamba papers; state-space models are currently a hot topic in recurrent neural networks.
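At the core of these papers is a discrete linear state-space recurrence. A toy sketch follows; the matrices here are random placeholders, not the structured HiPPO initialization that S4 actually uses.

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run x_k = A x_{k-1} + B u_k, y_k = C x_k over an input sequence u."""
    x = np.zeros(A.shape[0])
    ys = []
    for u_k in u:
        x = A @ x + B * u_k   # state update
        ys.append(C @ x)      # readout
    return np.array(ys)

rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)           # stable toy state matrix
B = rng.standard_normal(4)
C = rng.standard_normal(4)
y = ssm_scan(A, B, C, rng.standard_normal(16))
```

Because the recurrence is linear and time-invariant, it can equivalently be computed as a convolution, which is the trick S4 exploits for fast training.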

EleutherAI Diffusion Reading Group

I have also been active in the EleutherAI Diffusion Reading Group, where I gave a presentation on flow matching and diffusion distillation.
