(All lesson resources are available at course.fast.ai.) In this lesson, Jeremy introduces Dropout, a technique for improving model performance, and with special guests Tanishq and Johno he discusses Denoising Diffusion Probabilistic Models (DDPM), the foundational approach underlying diffusion models. The lesson covers the forward and reverse processes in DDPM, as well as the implementation of a noise-predicting model using a neural network. The team also demonstrates an alternative approach to the implementation and discusses ways to improve training speed.
0:00:00 - Introduction and quick update from last lesson
0:02:08 - Dropout
0:12:07 - DDPM from scratch - Paper and math
0:40:17 - DDPM - The code
0:41:16 - U-Net Neural Network
0:43:41 - Training process
0:56:07 - Inheriting from miniai TrainCB
1:00:22 - Using the trained model: denoising with “sample” method
1:09:09 - Inference: generating some images
1:14:56 - Notebook 17: Jeremy’s exploration of Tanishq’s notebook
1:24:09 - Make it faster: Initialization
1:27:41 - Make it faster: Mixed Precision
1:29:40 - Change of plans: Mixed Precision goes to Lesson 20
Many thanks to Francisco Mussari for timestamps and transcription.