How to handle Uncertainty in Deep Learning #1.2

4,164 views

DeepFindr

A day ago

Comments: 15
@trevormiller931 2 years ago
This is gold. Thanks for the thorough and great content!
@DeepFindr 2 years ago
Thanks!!
@robertchamoun7914 2 years ago
Thanks!
@jiahao2709 A year ago
One thing I want to say is that the test data is usually only used once, at test time; I think it would be better to call this "test data" validation data.
@kenbobcorn A year ago
It's probably worth mentioning that you are computing gradients on your test set by not using torch.no_grad() in the test loop. This series is all about uncertainty, so it's important that you aren't computing gradients on the test set that leak into your mu and var values, which in the end would be contrary to what you are trying to show.
@DeepFindr A year ago
Hi! Good remark. But as long as you aren't running backpropagation w.r.t. the test loss, it won't leak any information into the model weights. torch.no_grad() is mainly used for memory reasons.
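A minimal sketch of such an evaluation loop (hypothetical two-output model and random data, not the exact code from the video):

import torch
import torch.nn as nn

# Hypothetical model that outputs a mean and a log-variance per sample.
model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 2))

x_test = torch.randn(128, 4)
y_test = torch.randn(128, 1)

# torch.no_grad() disables autograd bookkeeping, which saves memory and compute.
# The weights themselves only change if loss.backward() and optimizer.step()
# are called, which an evaluation loop never does.
model.eval()
with torch.no_grad():
    out = model(x_test)
    mu, log_var = out[:, :1], out[:, 1:]
    var = log_var.exp()  # keep the variance positive
    nll = nn.functional.gaussian_nll_loss(mu, y_test, var)
    print(f"Test Gaussian NLL: {nll.item():.4f}")
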
@kevinkorfmann8780 2 years ago
Thanks, that was really helpful! Looking forward to part 3.
@DeepFindr 2 years ago
Thank you! Next part is coming next week :)
@nguyenxuanthanh6988 A year ago
Brilliant!!! These videos helped me a lot in understanding uncertainty. Could you make more videos on this topic? Thank you so much.
@shilpimajumder7917 2 years ago
Thanks for the video... I learned a lot. Please upload some videos on uncertainty estimation in image classification.
@DeepFindr 2 years ago
Thanks! For image classification the same principles apply; you just use other layers (Conv2d) instead of Linear.
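For example, a rough sketch of the same two-head idea with a convolutional backbone (hypothetical architecture and shapes, not the code from the video):

import torch
import torch.nn as nn

class ConvGaussianNet(nn.Module):
    # Conv2d layers extract image features; the same mean/variance heads
    # from the fully connected example sit on top of them.
    def __init__(self, out_dim=1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.mu_head = nn.Linear(32, out_dim)
        self.var_head = nn.Linear(32, out_dim)

    def forward(self, x):
        h = self.features(x).flatten(1)  # (batch, 32)
        mu = self.mu_head(h)
        var = nn.functional.softplus(self.var_head(h)) + 1e-6  # positive variance
        return mu, var

model = ConvGaussianNet()
images = torch.randn(8, 3, 64, 64)  # dummy batch of RGB images
mu, var = model(images)
print(mu.shape, var.shape)  # torch.Size([8, 1]) torch.Size([8, 1])
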
@clima3993 2 years ago
What if p(y|x) is not Gaussian? What if y is high-dimensional?
@DeepFindr 2 years ago
There are alternative loss functions for other distributions, like Laplace etc., or you can transform the target variable in some way to match a supported distribution. Multidimensionality is no problem for GaussianNLLLoss; it simply applies the calculation per dimension and averages the result.
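A small illustration of both points (random tensors, just to show the mechanics; the Laplace variant is one possible alternative, not necessarily what the video uses):

import torch
import torch.nn as nn

mu = torch.randn(16, 3)         # predicted means, 3 output dimensions
var = torch.rand(16, 3) + 1e-6  # predicted variances, must be positive
y = torch.randn(16, 3)          # targets

# GaussianNLLLoss handles multi-dimensional targets out of the box: with the
# default reduction='mean' it computes the NLL per element and averages.
gauss_nll = nn.GaussianNLLLoss()(mu, y, var)

# A Laplace assumption can be handled the same way by building the NLL from
# torch.distributions (scale is a hypothetical per-dimension scale prediction).
scale = torch.rand(16, 3) + 1e-6
laplace_nll = -torch.distributions.Laplace(mu, scale).log_prob(y).mean()

print(gauss_nll.item(), laplace_nll.item())
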
@clima3993 2 years ago
@@DeepFindr Thanks for the helpful reply. A follow-up question: what about using a conditional generative model to handle aleatoric uncertainty?
@DeepFindr 2 years ago
Sure, that's also a reasonable approach. You can learn the data distribution and detect out-of-distribution samples using generative models.
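A toy sketch of that idea (a fitted diagonal Gaussian stands in for the generative model; in practice one would use something stronger like a VAE or a normalizing flow, but the scoring logic is the same):

import torch

train = torch.randn(1000, 2)  # in-distribution training data
mean, std = train.mean(0), train.std(0)

# "Generative model": a diagonal Gaussian fitted to the training data.
density = torch.distributions.Independent(torch.distributions.Normal(mean, std), 1)

test_in = torch.randn(5, 2)            # looks like the training data
test_ood = torch.randn(5, 2) * 5 + 10  # clearly shifted samples

# Flag anything whose log-likelihood falls below the 1% quantile of the
# training log-likelihoods as out-of-distribution.
threshold = density.log_prob(train).quantile(0.01)
print(density.log_prob(test_in) > threshold)   # mostly True  -> in-distribution
print(density.log_prob(test_ood) > threshold)  # all False    -> flagged as OOD
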