L18.4: A GAN for Generating Handwritten Digits in PyTorch -- Code Example

  6,576 views

Sebastian Raschka

3 years ago

Slides: sebastianraschka.com/pdf/lect...
Code: github.com/rasbt/stat453-deep...
This video discusses 04_01_gan-mnist.ipynb
-------
This video is part of my Introduction to Deep Learning course.
Next video: • L18.5: Tips and Tricks...
The complete playlist: • Intro to Deep Learning...
A handy overview page with links to the materials: sebastianraschka.com/blog/202...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 7
@algorithmo134
@algorithmo134 1 month ago
Is a double for-loop like the one in Goodfellow's original GAN paper easier to implement in practice? For example, we freeze the generator while training the discriminator, and vice versa.
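(For context, the paper's alternating scheme — k inner discriminator steps per generator step — can be sketched as below. This is a hypothetical minimal example, not the notebook's code; with separate optimizers, explicit freezing is unnecessary, since each optimizer only updates its own parameters.)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Tiny stand-in models, just to illustrate the loop structure
G = nn.Linear(100, 784)
D = nn.Linear(784, 1)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

k = 2  # inner discriminator steps per generator step (k=1 in most modern code)
real = torch.rand(8, 784)

for _ in range(k):
    # Inner loop: train D; G is effectively "frozen" via .detach()
    z = torch.randn(8, 100)
    fake = G(z).detach()
    loss_d = (F.binary_cross_entropy_with_logits(D(real), torch.ones(8, 1))
              + F.binary_cross_entropy_with_logits(D(fake), torch.zeros(8, 1)))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

# Outer step: train G; D's weights are untouched because opt_g holds only G's
z = torch.randn(8, 100)
loss_g = F.binary_cross_entropy_with_logits(D(G(z)), torch.ones(8, 1))
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```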
@lewforth7147
@lewforth7147 5 months ago
Hello, Professor, thanks for the video. But I am confused about the z in the generator_forward part at around 5:58. You said you created a vector z (C * H * W) first, then flattened it with start_dim=1. But in self.generator, the first input size is latent_dim = 100. Why is it not 1*28*28 (C * H * W)? Thanks.
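(For context, the shapes in a typical GAN-MNIST setup — a sketch, not the notebook's exact code: z is a random latent code of size latent_dim=100 and has no image structure; the 1*28*28 = 784 shape belongs to the generator's *output*, which is reshaped back into an image.)

```python
import torch
import torch.nn as nn

latent_dim = 100          # generator input: a random code, not an image
img_shape = (1, 28, 28)   # C * H * W = 784, the generator's output size

# Hypothetical small generator; the real one lives in 04_01_gan-mnist.ipynb
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1 * 28 * 28), nn.Tanh())

z = torch.randn(4, latent_dim)       # (batch, 100)
img_flat = generator(z)              # (batch, 784)
img = img_flat.view(-1, *img_shape)  # (batch, 1, 28, 28)
print(img.shape)  # torch.Size([4, 1, 28, 28])
```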
@candylauuuu219
@candylauuuu219 1 year ago
Hi sir. Can I use this code with a custom image dataset? What file type are the MNIST images that are fed into the dataloader and the whole GAN training process?
@kafaayari
@kafaayari 2 years ago
Hello Mr. Raschka. Thank you very much for the great lecture. I have a question, though, regarding the necessity of detach. At 15:58, you say that it will influence the generator. But when setting up the optimizers, we made two separate optimizers for the generator and discriminator, and each selected only the relevant network's parameters. Why is the detach operation still needed?
@SebastianRaschka
@SebastianRaschka 2 years ago
Yeah, that's a good point. It wouldn't update the generator params, because those are not part of the discriminator's optimizer. However, I would definitely still use .detach(): (1) for efficiency reasons, i.e., without it, the backward pass would also go through the computation graph for the generator, which is wasteful. (2) And the computation graph is only destroyed when you call backward(). So, what that means is that you would probably also get weird results, because a computation graph involving the generator's parameters would already be constructed before you call the generator.
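(The distinction in this reply can be seen in a minimal sketch — hypothetical one-layer models, not the notebook's code. With .detach(), the discriminator's backward pass never reaches the generator, so the generator accumulates no gradients.)

```python
import torch
import torch.nn as nn

# Hypothetical tiny stand-ins for the notebook's models
generator = nn.Linear(100, 784)
discriminator = nn.Linear(784, 1)
optim_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

z = torch.randn(8, 100)
fake = generator(z)

# Without .detach(): backward() would also traverse the generator's part of
# the graph. The generator's weights would not be *updated* (they are not in
# optim_d), but gradients would still be computed for them -- wasted work.
loss_with_graph = nn.functional.binary_cross_entropy_with_logits(
    discriminator(fake), torch.zeros(8, 1))

# With .detach(): the graph is cut at the generator's output, so the
# discriminator update is completely independent of the generator.
loss_detached = nn.functional.binary_cross_entropy_with_logits(
    discriminator(fake.detach()), torch.zeros(8, 1))

optim_d.zero_grad()
loss_detached.backward()
optim_d.step()

# After the detached backward pass, the generator has no gradients:
print(generator.weight.grad is None)  # True
```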
@kafaayari
@kafaayari 2 years ago
@@SebastianRaschka Ah, now I see. Thank you very much professor!
@Facts-The-universe
@Facts-The-universe 3 months ago
Sir, I am also working on handwritten character generation with a GAN, for Marathi characters. Could you please tell me whether I can reuse your code, and where I can get it? Please reply.