Understanding Deep Learning Requires Rethinking Generalization

5,829 views

UCF CRCV

A day ago

Comments: 9
@glitchAI
@glitchAI 4 months ago
This talk is a gem.
@edoson01
@edoson01 6 years ago
Very clear, thanks! I think there is an error, though, in the linear models section: to have an infinite number of solutions to a linear model (equation system), d (the number of dimensions/variables) has to be larger than n (the number of samples/equations); it's written the opposite way in the presentation.
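A minimal numpy sketch of the point raised in this comment (synthetic data, not taken from the talk): when d > n the system X w = y is underdetermined, so if one exact solution exists, adding any vector from the null space of X gives another.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 20                              # fewer samples than dimensions
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Minimum-norm exact solution returned by least squares for an underdetermined system.
w_min, *_ = np.linalg.lstsq(X, y, rcond=None)

# Rows n..d-1 of Vh span the null space of X (X has full row rank almost surely).
null_basis = np.linalg.svd(X)[2][n:]
w_other = w_min + 3.0 * null_basis[0]     # shift along a null-space direction

print(np.allclose(X @ w_min, y))          # True
print(np.allclose(X @ w_other, y))        # True -> a second, distinct solution
```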
@redberries8039
@redberries8039 6 years ago
Very helpful... nice and clear... good job.
@hamedgholami261
@hamedgholami261 1 year ago
19:05 Is the Gaussian distribution conditioned on the class labels, or are the samples drawn from one single Gaussian (regardless of class) and then assigned a label?
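A small sketch contrasting the two data models this comment asks about (illustrative setup with assumed dimensions, not taken from the talk): class-conditional Gaussians, where inputs carry label information, versus a single Gaussian with labels assigned independently of the inputs, as in the paper's randomization experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, num_classes = 1000, 32, 10

# (a) Class-conditional: each class has its own mean, so inputs are predictive of labels.
labels_a = rng.integers(num_classes, size=n)
class_means = rng.standard_normal((num_classes, d))
X_a = class_means[labels_a] + rng.standard_normal((n, d))

# (b) Single Gaussian: inputs are label-independent noise; labels are pure chance.
X_b = rng.standard_normal((n, d))
labels_b = rng.integers(num_classes, size=n)
# In case (b) no classifier can generalize, yet an over-parameterized network can
# still fit the training labels -- the point of the randomization experiments.
```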
@jimmyshenmusic
@jimmyshenmusic 4 years ago
Thanks for this talk. Very clear and informative.
@SKRithvik
@SKRithvik 4 years ago
Really nice presentation! At 33:49 I think it must be d >= n. Could you please explain what e_t * x_i means at 35:00? I am assuming it means the partial derivative of the loss with respect to w, evaluated at x_i.
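A hedged sketch of the linear-model SGD argument this comment refers to, under the assumption of squared loss and the reading e_t = w_t·x_i − y_i (the per-sample prediction error), so that e_t * x_i is the gradient of that sample's loss with respect to w. Every update is then a multiple of some x_i, so w stays in the span of the training data and, started from zero, approaches the minimum-norm interpolating solution. Synthetic data, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 20
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

w = np.zeros(d)
eta = 0.01
for _ in range(20000):
    i = rng.integers(n)
    e_t = X[i] @ w - y[i]          # prediction error on the sampled example
    w -= eta * e_t * X[i]          # gradient step: a multiple of x_i

# Compare with the minimum-norm solution of the underdetermined system.
w_min_norm, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(w, w_min_norm, atol=1e-3))   # True (approximately)
```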
@szpaku1999
@szpaku1999 3 years ago
That was the video I needed.
@ProfessionalTycoons
@ProfessionalTycoons 5 years ago
Thank you for the talk.
@rubbermanburningflowers9204
@rubbermanburningflowers9204 6 years ago
What is the relationship between the rank and an infinite number of solutions?
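A brief illustrative sketch (not from the talk) of the connection this comment asks about, via the rank-nullity theorem: the null space of X has dimension d − rank(X), so if that is positive and one solution of X w = y exists, infinitely many do; if rank(X) = d, the solution (when it exists) is unique.

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.standard_normal((5, 20))        # rank 5 < d = 20 -> 15-dimensional null space
rank = np.linalg.matrix_rank(X)
print(rank, X.shape[1] - rank)           # 5 15 -> infinitely many interpolating solutions

X_full = rng.standard_normal((20, 20))   # full rank -> trivial null space, unique solution
print(np.linalg.matrix_rank(X_full))     # 20
```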
'How neural networks learn' - Part III: Generalization and Overfitting
22:35
MIT Introduction to Deep Learning | 6.S191
1:09:58
Alexander Amini
887K views
Understanding deep learning requires rethinking generalization
18:53
Tom Goldstein: "An empirical look at generalization in neural nets"
53:03
Institute for Pure & Applied Mathematics (IPAM)
6K views
Transformers (how LLMs work) explained visually | DL5
27:14
3Blue1Brown
4.7M views
Lecture 7 - Deep Learning Foundations: Neural Tangent Kernels
1:14:52
Pixel Recurrent Neural Networks
29:34
UCF CRCV
9K views
Tom Goldstein: "What do neural loss surfaces look like?"
50:26
Institute for Pure & Applied Mathematics (IPAM)
19K views
Understanding Deep Learning Requires Rethinking Generalization - TWiML Online Meetup - Dec 2017
1:05:04
The TWIML AI Podcast with Sam Charrington
1.7K views