DeepMind x UCL | Deep Learning Lectures | 9/12 | Generative Adversarial Networks

35,053 views

Google DeepMind

1 day ago

Comments: 28
@leixun
@leixun 4 years ago
*DeepMind x UCL | Deep Learning Lectures | 9/12 | Generative Adversarial Networks (GANs)*
*My takeaways:*
*1. Overview: why are we interested in GANs? 0:25*
1.1 GAN advances 4:22
1.2 Learning an implicit model through a two-player game: discriminator and generator 5:28
- Generator 6:38
- Discriminator 8:03
1.3 Training GANs 9:02
1.4 Unconditional and conditional generative models 41:18
*2. Evaluating GANs 43:52*
*3. The GAN Zoo 50:55*
3.1 Image synthesis with GANs: MNIST to ImageNet 51:46
- The original GANs 52:02
- Conditional GANs 53:16
- Laplacian GANs 54:08
- Deep convolutional GANs 57:30
- Spectrally normalised GANs 1:00:20
- Projection discriminator 1:01:54
- Self-attention GANs 1:03:12
- BigGANs 1:04:49
- BigGAN-deep 1:11:24
- LOGAN 1:14:12
- Progressive GANs 1:15:38
- StyleGANs 1:16:58
- Summary: from simple images to large-scale databases of high-resolution images 1:19:23
3.2 GANs for representation learning 1:21:05
- Why GANs?
-- Motivating example 1: semantics in the DCGAN latent space 1:21:28
-- Motivating example 2: unsupervised category discovery with BigGANs 1:22:16
- InfoGANs 1:23:59
- ALI / Bidirectional GANs 1:25:54
- BigBiGANs 1:29:28
*3.3 GANs for other modalities and problems 1:33:05*
- Pix2Pix: translating images between two domains 1:33:18
- CycleGANs: translating images between two domains 1:34:48
- GANs for audio synthesis: WaveGAN, MelGAN, GAN-TTS 1:36:19
- GANs for video synthesis and prediction: TGAN-2, DVD-GAN, TriVD-GAN 1:37:19
- GANs are everywhere 1:39:10
-- Imitation learning: GAIL
-- Image editing: GauGAN
-- Program synthesis: SPIRAL
-- Motion transfer: Everybody Dance Now
-- Domain adaptation: DANN
-- Art: Learning to See
@harshvardhangoyal5362
@harshvardhangoyal5362 3 years ago
mvp
@leixun
@leixun 3 years ago
@@harshvardhangoyal5362 You're welcome to check out my research on my channel.
@shivtavker
@shivtavker 4 years ago
At 17:48, why does KL(p, p^*) look like that? The divergence would be minimised by making p(x) as low as possible, so p could be a distribution that does very badly on both Gaussians.
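(For reference, a hedged sketch of the quantity the question refers to, assuming p^* denotes the fixed target density, e.g. the mixture of two Gaussians on that slide, and p the model being fitted:)

```latex
\mathrm{KL}(p \,\|\, p^{*}) \;=\; \int p(x)\,\log\frac{p(x)}{p^{*}(x)}\,dx ,
\qquad \text{with the constraint } \int p(x)\,dx = 1 .
```

Because p must still integrate to one, it cannot simply be "low everywhere": the integrand is large wherever p places mass but p^* is near zero, so this direction of the KL pushes p to concentrate on regions where p^* has mass (typically a single mode), rather than doing badly on both Gaussians.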
@CSEAsapannaRakeshRakesh
@CSEAsapannaRakeshRakesh 4 years ago
@10:58 "We only do a few steps of SGD for the discriminator" — is that one k-sized step per epoch (iteration)?
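(A minimal sketch of the alternating update scheme being described, assuming the standard non-saturating binary cross-entropy losses; the toy networks, the data sampler, and the name k_disc_steps are illustrative, not from the lecture:)

```python
import torch
import torch.nn as nn

# Toy generator and discriminator, shapes chosen only for illustration.
latent_dim, data_dim = 16, 2
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def sample_real(batch):
    # Stand-in for real data: points from a fixed Gaussian.
    return torch.randn(batch, data_dim) * 0.5 + 1.0

k_disc_steps = 2   # "a few" discriminator steps per generator step
batch = 64

for iteration in range(1000):
    # --- k mini-batch updates of the discriminator ---
    for _ in range(k_disc_steps):
        real = sample_real(batch)
        fake = G(torch.randn(batch, latent_dim)).detach()  # no gradients into G here
        loss_D = bce(D(real), torch.ones(batch, 1)) + \
                 bce(D(fake), torch.zeros(batch, 1))
        opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # --- one update of the generator (non-saturating loss) ---
    fake = G(torch.randn(batch, latent_dim))
    loss_G = bce(D(fake), torch.ones(batch, 1))  # try to make D call fakes "real"
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
```

In this sketch "a few steps" means k_disc_steps mini-batch SGD/Adam updates of D for every single update of G; it is not tied to epochs.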
@CSEAsapannaRakeshRakesh
@CSEAsapannaRakeshRakesh 4 years ago
@9:17 Why does the binary cross-entropy function have no negative sign here?
@CSEAsapannaRakeshRakesh
@CSEAsapannaRakeshRakesh 4 years ago
@10:12 Is it because we are "maximizing" D's prediction accuracy, i.e. cost(D) = -cost(G)?
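(For what it's worth, the standard minimax formulation from the original GAN paper writes the discriminator's objective as a maximization, which is where the usual minus sign of binary cross-entropy gets absorbed; this is a sketch, not necessarily the exact notation on the slide:)

```latex
\min_{G}\;\max_{D}\; V(D, G) \;=\;
\mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
\;+\;
\mathbb{E}_{z \sim p_{z}(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

Maximizing V over D is equivalent to minimizing the (negative-signed) binary cross-entropy loss of D, and in this zero-sum view cost(G) = -cost(D), as the comment suggests.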
@kirtipandya4618
@kirtipandya4618 3 years ago
Can we access the code exercises?
@agamemnonc
@agamemnonc 1 year ago
Great lecture, thank you! One small note: I believe the terminology "distance between two probability distributions" is not quite rigorous; even the KL divergence is not really a distance metric, as it is not symmetric.
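(A tiny numerical illustration of that asymmetry; the two discrete distributions here are arbitrary examples, not from the lecture:)

```python
import numpy as np

def kl(p, q):
    # KL(p || q) for discrete distributions over the same support.
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.8, 0.1, 0.1])
q = np.array([0.4, 0.4, 0.2])

print(kl(p, q))  # ~0.347
print(kl(q, p))  # ~0.416 -> KL(p||q) != KL(q||p), so it is not symmetric
```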
@robertfoertsch
@robertfoertsch 4 years ago
Excellent, Added To My Research Library, Sharing Through TheTRUTH Network...
@lukn4100
@lukn4100 3 years ago
Great lecture and big thanks to DeepMind for sharing this great content.
@awadelrahman
@awadelrahman 4 years ago
Quite apart from the extremely wonderful lecture!!!!! I am always wondering why GAN people have a talking style and tone so similar to Goodfellow's!! @ Jeff :D ... Thanks a lot ;)
@mathavraj9378
@mathavraj9378 4 years ago
Could someone tell me why we call it "latent" noise? Latent means something hidden, right? So what is hidden about the input noise?
@haejinsong1835
@haejinsong1835 4 years ago
The idea is that the latent noise (the input to the generator) is not an observable variable. People often use "unobservable" / "hidden" / "latent" to refer to variables that are not observed in the dataset. Cf. if we have a collection of images, the images are the observable variables.
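(In code terms, a sketch with a toy generator whose shapes are purely illustrative: the latent vector z exists only inside the model, while the dataset contains images x but never the z that produced them.)

```python
import torch
import torch.nn as nn

latent_dim = 16
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, 2))  # toy generator

z = torch.randn(1, latent_dim)  # "latent": sampled inside the model, never observed in the dataset
x_fake = G(z)                   # only samples like x_fake (or real images) are observable data
```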
@mohitpilkhan7003
@mohitpilkhan7003 4 years ago
It's an amazing overview. Loved it very much. Thank you DeepMind, and love you.
@pervezbhan1708
@pervezbhan1708 2 years ago
kzbin.info/www/bejne/qJC0YmWLfsuAoqc
@GeneralKenobi69420
@GeneralKenobi69420 4 years ago
1:31:10 Lol are we just gonna ignore the pic of a woman wearing black latex pants? 👀 (Also do NOT zoom in on that picture in the bottom left... It's like some of the worst nightmare fuel I've ever seen in my life. JFC)
@quosswimblik4489
@quosswimblik4489 3 years ago
GANs are cool, but what can you do with CIANs (clown-and-identifier adversarial networks)? You have one AI trying to identify things and another network trying to fool the identifying AI into making a mistake. The clown AI tries to find holes in the identifier's mindset so as to give the identifier a more general fit, and is for training identification, whereas the GAN is the other way round, training the generator on a specific imitation task.
@jayanthkumar9637
@jayanthkumar9637 3 years ago
I just loved her voice
@sanjeevi567
@sanjeevi567 4 years ago
Wonderful, thanks guys... GANs (wow!)
@Daniel-mj8jt
@Daniel-mj8jt 2 years ago
Excellent lecture!
@luksdoc
@luksdoc 4 years ago
A wonderful lecture.
@lizgichora6472
@lizgichora6472 3 years ago
Thank you, very interesting work; CycleGAN translating between domains.
@myoneuralnetwork3188
@myoneuralnetwork3188 4 years ago
If you'd like a beginner-friendly, easy-to-read guide to GANs and building them with PyTorch, you might find "Make Your First GAN With PyTorch" useful: www.amazon.com/dp/B085RNKXPD All the code is open source on GitHub: github.com/makeyourownneuralnetwork/gan
@iinarrab19
@iinarrab19 4 years ago
Great. My only feedback is that she needs to master how to speak effectively, as in when to pause and breathe properly.