Stanford CS236: Deep Generative Models I 2023 I Lecture 7 - Normalizing Flows

886 views

Stanford Online

18 days ago

For more information about Stanford's Artificial Intelligence programs visit: stanford.io/ai
To follow along with the course, visit the course website:
deepgenerativemodels.github.io/
Stefano Ermon
Associate Professor of Computer Science, Stanford University
cs.stanford.edu/~ermon/
Learn more about the online course and how to enroll: online.stanford.edu/courses/c...
To view all online courses and programs offered by Stanford, visit: online.stanford.edu/

Comments: 1
@CPTSMONSTER
3 days ago
8:00 Without the KL term, similar to a stochastic autoencoder which takes an input and maps it to a distribution over latent variables
8:30 Reconstruction term makes outputs resemble the data; KL term encourages latent variables generated through the encoder to be distributed similarly to the prior distribution (Gaussian in this case)
10:00? Trick decoder
12:50? q is also stochastic
14:10 Both p and q are generative models; q is only regularizing the latent space of an autoencoder
15:10 Matching the marginal distribution of z under p and under q seems like a possible training objective, but the integrals are intractable
24:10? If p is a powerful autoregressive model, then z is not needed
32:05? Sample p of z given x: invert the generative process to find z's likely under that posterior, which is intractable to compute
34:25? Sample from the conditional, not selecting the most likely z
53:50 Change of variables formula
56:40 Mapping the unit hypercube to a parallelotope (linear invertible transformation)
59:10 Area of a parallelogram is the determinant of the matrix
59:50 Parallelotope pdf
1:08 Non-linear invertible transformation formula, generalized to the determinant of the Jacobian of f. Dimensions of x and z are equal, unlike in VAEs. The determinant of the Jacobian of the inverse of f is equal to the inverse of the determinant of the Jacobian of f.
1:15:00 Worked example of the non-linear transformation pdf formula
1:17:45 Two interpretations of diffusion models: stacked VAEs and infinitely deep flow models
1:21:20 Flow model intuition: latent variables z don't compress dimensionality; they view the data from another angle to make it easier to model
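The change-of-variables idea in the notes above can be sketched numerically. This is a minimal illustration (not code from the lecture), assuming a 1-D standard normal base density and the invertible map f(z) = exp(z): the density of x = f(z) is p_z(f_inv(x)) times the absolute derivative of f_inv, here 1/x.

```python
import numpy as np

# Change of variables in 1-D: if x = f(z) with f invertible and z ~ p_z, then
#   p_x(x) = p_z(f_inv(x)) * |d f_inv / dx|.
# With z ~ N(0, 1) and f(z) = exp(z): f_inv(x) = log(x), |d f_inv / dx| = 1/x,
# which yields the standard log-normal density.

def p_z(z):
    # standard normal density
    return np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)

def p_x(x):
    # transformed density: base density at f_inv(x), scaled by the
    # Jacobian (here just the derivative) of the inverse map
    return p_z(np.log(x)) * (1.0 / x)

# Sanity check: the transformed density should still integrate to ~1.
xs = np.linspace(1e-6, 50.0, 2_000_000)
mass = np.trapz(p_x(xs), xs)
print(round(mass, 3))  # close to 1.0
```

In higher dimensions the scalar derivative becomes the determinant of the Jacobian of f_inv, which is why flows (as noted at 1:08) require x and z to have equal dimension and the transformation to be invertible.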