PyTorch Tutorial 12 - Activation Functions

38,269 views

Patrick Loeber



Comments: 26
@normalperson1130 4 years ago
Thanks. This couldn't have come at a better time. I have a university project that needs PyTorch, and this will be greatly helpful.
@patloeber 4 years ago
Thanks for watching! I'm glad it is helpful :)
@alteshaus3149 3 years ago
Thank you for these great videos. I can really learn a lot, and it helps me with my future career.
@Ftur-57-fetr 3 years ago
Simple, clear, THANKS!
@patloeber 3 years ago
Glad it helped!
@riadhossainbhuiyan4978 6 months ago
Great teaching ❤
@haoranwan1233 2 years ago
Thanks! The best tutorial I have ever seen!
@mahmoudabbasi1994 3 years ago
A great course. Thank you!
@DanielWeikert 4 years ago
Thanks for all your effort. Appreciate it! Could you do a video on text generation with PyTorch? Best regards
@patloeber 4 years ago
Thanks for watching! I have 4 more tutorials planned for this playlist: a feed-forward net, a convolutional neural net, transfer learning, and TensorBoard. After that I want to do more practical applications, so text generation is a nice suggestion :)
@beltraomartins 2 years ago
Thank you!
@nicolasgabrielsantanaramos291 4 years ago
Thanks for the class!
@amrutapatil1417 3 years ago
Great tutorial, thank you so much!
@patloeber 3 years ago
You're very welcome!
@HoangNguyen-be4vy 4 years ago
As I understand your example, this NN has 3 layers: one input layer, one hidden layer (using the ReLU function), and one output layer (size 1, using the sigmoid function). Am I understanding it correctly? Correct me if I am wrong.
@patloeber 4 years ago
Exactly :)
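The architecture agreed on above can be sketched in a few lines. This is only an illustration: the input and hidden sizes below are made-up values for the example, not taken from the video.

```python
import torch
import torch.nn as nn

# Sketch of the network described above: one hidden layer with ReLU,
# and a size-1 output squashed by sigmoid (as in binary classification).
# input_size and hidden_size are illustrative choices.
class NeuralNet(nn.Module):
    def __init__(self, input_size=4, hidden_size=8):
        super().__init__()
        self.linear1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(hidden_size, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        out = self.relu(self.linear1(x))
        return self.sigmoid(self.linear2(out))

model = NeuralNet()
y = model(torch.randn(3, 4))  # batch of 3 samples
print(y.shape)  # torch.Size([3, 1]); every value lies in (0, 1)
```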
@alirezamohseni5045 5 months ago
Thanks a lot 😀
@VarunKumar-pz5si 3 years ago
Please do some videos on computer vision projects.
@patloeber 3 years ago
Yes, good suggestion!
@urosgrandovec3409 11 months ago
Without activation functions, a composition of linear transformations is again a linear transformation.
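This point is easy to verify numerically: two stacked `nn.Linear` layers with no activation in between collapse into a single affine map with weight `W2 @ W1` and bias `W2 @ b1 + b2`. A small sketch with made-up layer sizes:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
f1 = nn.Linear(3, 5)
f2 = nn.Linear(5, 2)

x = torch.randn(4, 3)
stacked = f2(f1(x))  # two linear layers, no activation in between

# Collapse them into one affine map: W = W2 @ W1, b = W2 @ b1 + b2
with torch.no_grad():
    W = f2.weight @ f1.weight
    b = f2.weight @ f1.bias + f2.bias
    collapsed = x @ W.T + b

print(torch.allclose(stacked, collapsed, atol=1e-5))  # True
```

So without a nonlinearity between them, the second layer adds no expressive power over a single linear layer.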
@anonim5052 7 months ago
great
@saurrav3801 4 years ago
Thanks bro, I saw it
@abderrahmanebououden5173 4 years ago
Thanks bro :)
@joywang8173 1 year ago
Does anyone know why tanh is a good option for hidden layers? I'm new to deep learning 😊
@saimakhalil5137 3 months ago
The tanh function outputs values between -1 and 1, which helps center the activations around zero. Tanh is also non-linear, which allows neural networks to learn complex relationships in data. It has stronger gradients than sigmoid, particularly around zero, which can facilitate learning and convergence, especially in deeper networks. Unlike sigmoid, which outputs values between 0 and 1 centered around 0.5, tanh is centered around 0, so weights and biases can be updated in both directions in weight space, which can make optimization easier.
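Those properties can be checked with autograd. In closed form, tanh(0) = 0 with derivative 1, while sigmoid(0) = 0.5 with derivative 0.25, so tanh is zero-centered and has the stronger gradient around zero:

```python
import torch

# Gradient of tanh at 0: 1 - tanh(0)**2 = 1
x1 = torch.tensor(0.0, requires_grad=True)
torch.tanh(x1).backward()

# Gradient of sigmoid at 0: 0.5 * (1 - 0.5) = 0.25
x2 = torch.tensor(0.0, requires_grad=True)
torch.sigmoid(x2).backward()

print(torch.tanh(torch.tensor(0.0)).item())     # 0.0 -> zero-centered
print(torch.sigmoid(torch.tensor(0.0)).item())  # 0.5 -> centered at 0.5
print(x1.grad.item())  # 1.0
print(x2.grad.item())  # 0.25
```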
@bluebox6307 3 years ago
If anybody wonders (like I did) whether there is any difference between instantiating nn.Sigmoid() and calling torch.sigmoid(), check out the answer from KFrank at discuss.pytorch.org/t/torch-nn-sigmoid-vs-torch-sigmoid/57691/3 :) (and according to discuss.pytorch.org/t/is-there-any-different-between-torch-sigmoid-and-torch-nn-functional-sigmoid/995, torch.nn.functional.sigmoid seems to be deprecated by now)
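For anyone skimming: both forms compute the same values. nn.Sigmoid is the Module wrapper (handy inside nn.Sequential), while torch.sigmoid is the plain function. A quick check:

```python
import torch
import torch.nn as nn

x = torch.randn(5)
module_out = nn.Sigmoid()(x)       # module form, composable in nn.Sequential
functional_out = torch.sigmoid(x)  # plain function form

print(torch.equal(module_out, functional_out))  # True
```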
PyTorch Tutorial 13 - Feed-Forward Neural Network
21:34
Patrick Loeber
90K views
PyTorch Tutorial 11 - Softmax and Cross Entropy
18:17
Patrick Loeber
90K views
A Review of 10 Most Popular Activation Functions in Neural Networks
15:59
Machine Learning Studio
12K views
Mastering Activation Functions in PyTorch: A Deep Dive Tutorial
15:46
Ryan & Matt Data Science
281 views
ML Was Hard Until I Learned These 5 Secrets!
13:11
Boris Meinardus
335K views
The StatQuest Introduction to PyTorch
23:22
StatQuest with Josh Starmer
159K views
Activation Functions - EXPLAINED!
10:05
CodeEmporium
119K views
Neural Networks Explained from Scratch using Python
17:38
Bot Academy
346K views
RBF Networks
20:10
macheads101
52K views